Handling iOS image uploads with Paperclip
If you happen to do backend work and iOS work at the same time, or work on a team that does, this tip might be helpful:
Let's say you have a Rails app that uses the Paperclip gem by Thoughtbot to handle file uploads.
Then let's say you have an iOS app that wants to upload photos to the server. You've tried building a multi-part request with something like AFNetworking, since Alamofire doesn't support multi-part requests and NSURLSession isn't quite convenient enough to do it on its own.
In your Rails code, you'll do something like
@photo.image = params[:image]
and you'll get a vague error about invalid characters.
The problem is that Rails fails to detect the file in your multi-part request, so Paperclip can't do its job either. You're trying to send JSON parameters and an image file together, and you're doing it the wrong way. Here's how you fix it in three easy steps:
- Base64 encode the
NSData object representing your image.
- Send it just like any other JSON parameter because now it's a string.
- In your Rails code, decode the Base64 value like so:
@photo.image = StringIO.new(Base64.decode64(params[:image]))
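The full round trip can be sketched in plain Ruby (the image bytes below are a stand-in for real photo data, and params is a stand-in for the decoded request parameters):

```ruby
require "base64"
require "stringio"

# Client side: the iOS app Base64-encodes the raw image bytes and
# sends the result as an ordinary JSON string parameter.
raw_bytes = "\x89PNG\r\nfake image data".b
params = { image: Base64.strict_encode64(raw_bytes) }

# Server side: decode the string back into bytes and wrap them in a
# StringIO, which Paperclip accepts just like an uploaded file.
decoded = StringIO.new(Base64.decode64(params[:image]))
roundtrip = decoded.read
```

Because the encoded value is just a string, it travels in the JSON body alongside your other parameters with no multi-part boundaries involved.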
If you dig through Paperclip's code, you'll find that an attachment (your photo) has a number of IO adapters for turning input into a complete attachment. One of them handles
StringIO input by writing the data out as a temporary file for Paperclip's internals to process. If you search around the internet for this problem, you'll find solutions that suggest manually writing files yourself, but you don't have to bother! The smart people working on Paperclip have already considered this use-case and simply didn't document it well enough.
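What that adapter does amounts to something like the following (an illustrative sketch, not Paperclip's actual source): spool the in-memory bytes into a temp file that the rest of the pipeline can treat like any other upload.

```ruby
require "stringio"
require "tempfile"

# The StringIO you assign to the attachment...
data = StringIO.new("decoded image bytes")

# ...gets written out to a temporary file, roughly like this.
tempfile = Tempfile.new(["upload", ".png"])
tempfile.binmode
tempfile.write(data.read)
tempfile.rewind
contents = tempfile.read
tempfile.close!
```

So assigning the StringIO directly is enough; there's no need to manage temp files in your own controller code.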