Building an image processor on AWS Lambda using The Serverless Framework

Good writeup on setting up an image processor using The Serverless Framework, a tool comparable to the aforementioned Apex (and with an awfully generic and confusing name, imho 😉).

  1. When a user uploads a file, an ObjectCreated event is produced and a Lambda function is invoked.
  2. The Lambda function calls Amazon Rekognition to detect the faces and the emotion of each face in the uploaded image.
  3. The Lambda function processes the image and persists the result in Amazon S3.
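The three steps above can be sketched as a small Lambda handler. This is a minimal, hypothetical sketch (not the code from the linked post): the Rekognition and S3 clients are passed in as parameters so the logic is testable without AWS; in a real Lambda you'd use `boto3.client("rekognition")` and `boto3.client("s3")`, and you'd likely draw boxes on the image rather than write a JSON summary.

```python
import json

def handle(event, context, rekognition=None, s3=None):
    """Sketch: S3 ObjectCreated event -> Rekognition -> result back to S3.
    `rekognition` and `s3` are injected clients (boto3 clients in Lambda)."""
    # the S3 trigger delivers the bucket and key of the uploaded object
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # ask Rekognition for face details, including emotions
    faces = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        Attributes=["ALL"],
    )["FaceDetails"]

    # pick the most confident emotion per face
    emotions = [
        max(face["Emotions"], key=lambda e: e["Confidence"])["Type"]
        for face in faces
    ]

    # persist a summary next to the original object
    s3.put_object(
        Bucket=bucket,
        Key=key + ".faces.json",
        Body=json.dumps(emotions).encode(),
    )
    return emotions
```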

Here’s an example set of results:

The code of the Lambda function that calls Amazon Rekognition and processes uploaded images is available on GitHub.

How to build powerful back-ends easily with Serverless →

Apex – Serverless Infrastructure


Apex lets you build, deploy, and manage AWS Lambda functions with ease. A variety of workflow related tooling is provided for testing functions, rolling back deploys, viewing metrics, tailing logs, hooking into the build system and more.

Define a project.json like so …

{
  "name": "bar",
  "description": "Node.js example function",
  "runtime": "nodejs",
  "memory": 128,
  "timeout": 5,
  "role": "arn:aws:iam::293503197324:role/lambda"
}

… and then place your function inside the functions/{name}/ folder, and finally deploy it by running apex deploy.
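For the `bar` project above, the function itself can stay tiny. A minimal sketch, assuming Apex's Python runtime convention of a module-level `handle(event, context)` in `functions/bar/main.py` (the `project.json` example uses the `nodejs` runtime; this is the Python equivalent, not code from the Apex docs):

```python
# functions/bar/main.py — minimal Apex-style handler sketch.
# Apex invokes handle(event, context), like a plain AWS Lambda handler.
def handle(event, context):
    # echo the incoming event back, just to prove the deploy works
    return {"echo": event}
```

After `apex deploy`, you'd exercise it with `apex invoke bar`.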

Apex augments AWS Lambda by supporting more languages than Node(JS)/Python/Java:

With Apex you can use languages that are not natively supported by AWS Lambda, such as Golang, through the use of a Node.js shim injected into the build.

Apex – Serverless Infrastructure →

The $2375 Amazon AWS mistake


When I got to GitHub, I checked my application.yml, and it was online with my [Amazon S3] API keys… Crap! I reverted the last few commits, and deleted all traces from GitHub. I was able to clean it up within about 5 minutes and no one else knew about the repo. After a close call, I went to bed.

When I woke up the next morning, I had four emails from Amazon AWS and a missed phone call from Amazon AWS. Something about 140 servers running on my AWS account. What? How? I only had S3 keys on my GitHub and they were gone within 5 minutes!

Let this be a lesson to treat your API keys/tokens/etc. like your passwords: never expose them. And if they do get exposed – even for just a little while – change them all.
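In practice, keeping keys out of the repo means loading them from the environment (or a credentials file that's in `.gitignore`). A minimal sketch; the variable names match what the AWS SDKs pick up automatically, but the helper itself is illustrative:

```python
import os

def load_aws_credentials():
    """Read AWS credentials from the environment instead of a committed
    config file. The AWS SDKs read AWS_ACCESS_KEY_ID /
    AWS_SECRET_ACCESS_KEY on their own, so often you need no code at all."""
    key_id = os.environ.get("AWS_ACCESS_KEY_ID")
    secret = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if not key_id or not secret:
        raise RuntimeError("AWS credentials not set in the environment")
    return key_id, secret
```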

My $2375 Amazon EC2 Mistake →

AWS Resource APIs for PHP

<?php
 
require 'vendor/autoload.php';
 
use Aws\Resource\Aws;
 
$aws = new Aws([
    'region'  => 'us-west-2',
    'version' => 'latest',
    'profile' => 'your-credential-profile',
]);

$bucket = $aws->s3->bucket('your-bucket');

$object = $bucket->putObject([
    'Key'  => 'images/image001.jpg',
    'Body' => fopen('/path/to/image.jpg', 'r'),
]);

The core AWS SDK for PHP is composed of service client objects that have methods corresponding 1-to-1 with operations in the service’s API. This project builds upon the SDK to add new types of objects that allow you to interact with the AWS service APIs in a more resource-oriented way. This allows you to use a more expressive syntax when working with AWS services, because you are acting on objects that understand their relationships with other resources and that encapsulate their identifying information.

Yes!

Preview the AWS Resource APIs for PHP →
AWS Resource APIs for PHP (GitHub) →

Amazon Prime Air

Get packages into customers’ hands in 30 minutes or less using unmanned aerial vehicles

Still a few years away from us. Why announce it now? I can relate to this quote by Hiten Shah:

This is a pre-emptive move to help the world go in the way Amazon wants it to go. Perhaps to help push the FAA rules and regulations forward faster.

Man DDoSes his own S3 bucket by embedding its images in a Google Spreadsheet

I login to my AWS account to see what is going on, and I see this: $1177.76 in usage charges! A thousand, one hundred, seventy seven dollars. Out of which $1065 in outgoing bandwidth transfer costs. The scary part: 8.8 Terabytes of outgoing traffic! Tera. Not Giga. Terabytes.

To make things worse, I realized that the cost was going up hour after hour. Fifty to a hundred dollars more in billing charges with each. passing. hour. I started sweating.
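The numbers roughly check out. A back-of-the-envelope sketch, assuming an S3 egress price of about $0.12/GB (an assumed ballpark rate for the time, not a figure from the post):

```python
# Sanity-check the bandwidth bill: 8.8 TB out at an assumed ~$0.12/GB.
tb_out = 8.8
rate_per_gb = 0.12          # assumed egress rate, $/GB
cost = tb_out * 1024 * rate_per_gb   # TB -> GB, then GB * $/GB
# lands in the same ballpark as the reported ~$1,065
```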

The Google attack: How I attacked myself using Google Spreadsheets and I ramped up a $1000 bandwidth bill →

(via @codepo8)