
AJAX + Python + Amazon S3

5 April 2013 (2 minute read)

🔮 This post is also available via Gemini.

python aws s3 technology

I wanted a way in which users can seamlessly upload images for use in the Heroku application discussed in previous posts.

Ideally, the image would be uploaded through AJAX as part of a data-entry form, without a page refresh or anything else that would disrupt the user’s experience. As far as I know, barebones jQuery does not support AJAX file uploads, but this handy plugin does.

Handling the upload (AJAX)

I styled the file input nicely (in a similar way to this guy) and added the JS so that the upload is sent properly (and to the appropriate URL) when a change to the input is detected (i.e. the user does not need to click an ‘upload’ button to start the upload).

Receiving the upload (Python)

The backend, as previously mentioned, is written in Python as part of a Flask app. Since Heroku’s customer webspace is read-only, uploads have to be stored elsewhere. Boto is a cool library for interfacing with various AWS products (including S3) and can easily be installed with pip install boto. From this library, we’re going to need the S3Connection and Key classes:

from boto.s3.connection import S3Connection
from boto.s3.key import Key

Now we can easily handle the transfer using the request object exposed to Flask’s routing methods:

upload = request.files['file_input_name']
con = S3Connection('AWS_KEY', 'AWS_SECRET')
key = Key(con.get_bucket('BUCKET_NAME'))
key.key = upload.filename  # the key needs a name before its contents can be written
key.set_contents_from_file(upload)

Go to the next step for the AWS details and the bucket name. Depending on which AWS region you chose (e.g. US, Europe, etc.), your file will be accessible at something like https://s3-eu-west-1.amazonaws.com/<BUCKET_NAME>/<FILENAME>. If you want, you can also set, among other things, the file’s MIME type and access level:

key.set_metadata('Content-Type', 'image/png')
key.set_acl('public-read')
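That public URL is easy to assemble yourself once you know the region, bucket and filename. A small sketch (the helper name and argument order are my own, not part of boto):

```python
def s3_url(region, bucket, filename):
    """Build the public URL for an object stored in an S3 bucket."""
    return 'https://s3-{0}.amazonaws.com/{1}/{2}'.format(region, bucket, filename)
```

So, for a bucket in Ireland, s3_url('eu-west-1', 'my-bucket', 'cat.png') gives https://s3-eu-west-1.amazonaws.com/my-bucket/cat.png.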

Setting up the bucket (Amazon S3)

Finally you’ll need to create the bucket. Create or log into your AWS account, go to the AWS console, choose your region (if you’re in Europe, then the Ireland one is probably the best choice) and enter the S3 section. Here, create a bucket (the name needs to be globally unique). Now, go to your account settings page to find your AWS access key and secret and plug these, along with the bucket name, into the appropriate places in your Python file.
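If you’d rather not click through the console, the bucket can also be created from Python with the same Boto connection. A sketch, assuming the EU (Ireland) region; the function name, credentials and bucket name are placeholders of my own:

```python
from boto.s3.connection import S3Connection, Location


def create_upload_bucket(aws_key, aws_secret, name):
    """Create an S3 bucket in the EU (Ireland) region; the name must be globally unique."""
    con = S3Connection(aws_key, aws_secret)
    return con.create_bucket(name, location=Location.EU)
```

Call it once with your real key, secret and chosen bucket name, then reuse the bucket from your upload code.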

And that’s it. For large files, this may tie up your Heroku dynos a bit while they carry out the upload, so this technique is best for smaller files (especially if you’re only using the one web dyno). My example of a working implementation of this is available in this file.
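Putting the pieces above together, a minimal Flask route might look like this. It’s a sketch, not my exact code: the /upload route, the file_input_name form field and the AWS_KEY/AWS_SECRET/BUCKET_NAME placeholders are all assumptions.

```python
from flask import Flask, request
from boto.s3.connection import S3Connection
from boto.s3.key import Key

app = Flask(__name__)


@app.route('/upload', methods=['POST'])
def upload():
    f = request.files['file_input_name']        # the file posted by the AJAX plugin
    con = S3Connection('AWS_KEY', 'AWS_SECRET')
    key = Key(con.get_bucket('BUCKET_NAME'))
    key.key = f.filename                        # name the key in the bucket
    key.set_metadata('Content-Type', f.mimetype)
    key.set_contents_from_file(f)
    key.set_acl('public-read')                  # make the image publicly readable
    return 'uploaded'
```

The AJAX plugin from earlier just needs to POST its form data to /upload (or whatever you name the route).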
