
I am working on getting a legacy application up and running. It uses Python 2.7, and I have managed to get it Dockerized and deployed to ECS using Fargate.

However, the legacy code uses boto 2.49.0 (not boto3) and is failing to get credentials, as it appears to be hitting the EC2 instance metadata endpoint, which doesn't exist on ECS (Fargate?).

curl http://169.254.169.254/latest/meta-data/iam/security-credentials/
curl: (7) Couldn't connect to server

Stack trace when connecting:

import boto
boto.connect_s3(debug=2)
DEBUG:boto:http://169.254.169.254/latest/meta-data/iam/security-credentials/
2024-06-17 20:51:37,294 boto [ERROR]:Caught exception reading instance data
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/boto/utils.py", line 218, in retry_url
    r = opener.open(req, timeout=timeout)
  File "/usr/lib/python2.7/urllib2.py", line 429, in open
    response = self._open(req, data)
  File "/usr/lib/python2.7/urllib2.py", line 447, in _open
    '_open', req)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 1235, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.7/urllib2.py", line 1205, in do_open
    raise URLError(err)
URLError: <urlopen error [Errno 22] Invalid argument>
ERROR:boto:Caught exception reading instance data
2024-06-17 20:51:37,295 boto [ERROR]:Unable to read instance data, giving up

I know for certain the ECS task role credentials are available at the URI in the environment variables:

curl 169.254.170.2$AWS_CONTAINER_CREDENTIALS_RELATIVE_URI
{"RoleArn": ..., "AccessKeyId":...

Is it possible for me to somehow "point" the old boto library to this endpoint? I can't find any configuration for it. I tried getting it working with

output=$(curl -s 169.254.170.2$AWS_CONTAINER_CREDENTIALS_RELATIVE_URI)
export AWS_ACCESS_KEY_ID=$(jq -r .AccessKeyId <<< "$output")
export AWS_SECRET_ACCESS_KEY=$(jq -r .SecretAccessKey <<< "$output")
export AWS_SECURITY_TOKEN=$(jq -r .Token <<< "$output")

But I am struggling to think of how to refresh the credentials. I don't want to attach long-lived credentials to my ECS task, though I may have to resort to that, or to monkeypatching boto2.

Is there a way I can configure this boto version to load credentials from a different endpoint?

1 Answer


Patching boto2 or adding long-lived (static) credentials seem like the easiest alternatives.
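If you do go the patching route, a full monkeypatch may not even be necessary: boto2's connect_* calls accept explicit credentials, including a security token, so you can fetch them from the ECS endpoint yourself and pass them in. A minimal sketch, assuming Python 2.7; the fetch_ecs_credentials helper is purely illustrative:

import json
import os
import urllib2

import boto

def fetch_ecs_credentials():
    # Read the temporary task-role credentials from the ECS credentials endpoint.
    uri = "http://169.254.170.2" + os.environ["AWS_CONTAINER_CREDENTIALS_RELATIVE_URI"]
    return json.load(urllib2.urlopen(uri, timeout=2))

creds = fetch_ecs_credentials()

# Passing credentials explicitly skips the EC2 metadata lookup entirely.
conn = boto.connect_s3(
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    security_token=creds["Token"],
)

The drawback is the one you already ran into: these credentials expire, so a long-running process still has to re-fetch and reconnect periodically. That said, if you are looking for a workaround that doesn't touch the application code: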

Similar to boto3, the old boto looks for credentials in ~/.aws/credentials; this is where your static credentials would go. Exporting the environment variables does not work because the ECS container evaluates them at spin-up, as per the task definition. So you need a script which will:

  1. get the credentials
  2. put them in the ~/.aws/credentials
  3. since these are temporary credentials, set up a cron job which will update these regularly

Taking inspiration from how moto creates fake credentials:

mkdir -p ~/.aws && echo -e "[default]\naws_access_key_id = $AWS_ACCESS_KEY_ID\naws_secret_access_key = $AWS_SECRET_ACCESS_KEY\naws_session_token = $AWS_SECURITY_TOKEN" > ~/.aws/credentials
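That covers steps 1 and 2; for step 3, here is a rough sketch of a refresh script you could bake into the image and run from cron. The file name refresh_aws_creds.py is just an example, and writing both aws_session_token and aws_security_token is a hedge, since older boto versions have read the latter:

#!/usr/bin/env python
# Hypothetical refresh script: fetch the temporary task-role credentials from
# the ECS credentials endpoint and rewrite ~/.aws/credentials for boto2.
import json
import os
import urllib2

uri = "http://169.254.170.2" + os.environ["AWS_CONTAINER_CREDENTIALS_RELATIVE_URI"]
creds = json.load(urllib2.urlopen(uri, timeout=2))

aws_dir = os.path.expanduser("~/.aws")
if not os.path.isdir(aws_dir):
    os.makedirs(aws_dir)

# Write both token keys; the duplicate entry is harmless.
with open(os.path.join(aws_dir, "credentials"), "w") as f:
    f.write("[default]\n")
    f.write("aws_access_key_id = %s\n" % creds["AccessKeyId"])
    f.write("aws_secret_access_key = %s\n" % creds["SecretAccessKey"])
    f.write("aws_session_token = %s\n" % creds["Token"])
    f.write("aws_security_token = %s\n" % creds["Token"])

A crontab entry along the lines of */15 * * * * /usr/local/bin/refresh_aws_creds.py should keep the file fresh well within the task-role credential lifetime. One caveat: cron does not inherit the container's environment, so you may need to capture AWS_CONTAINER_CREDENTIALS_RELATIVE_URI at container start-up (for example, write it to a file the script reads).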
