I am using Claudia.js (https://claudiajs.com) to deploy my Lambda functions to AWS and wire them up to my API Gateway or S3 bucket. It is a really easy solution: no extra YAML or learning curve, just an npm package installed globally and hooked into your package.json scripts. But if you use your local machine to deploy Lambda functions, you might run into problems with the build artifacts not matching the AWS environment. This may not be an issue if all your dependencies are pure JavaScript; but if any of your dependencies use low-level C/C++ addons (e.g. https://github.com/lovell/sharp), you will face problems.

The simple Lambda function below will not run (assuming your local machine is not binary-compatible with Amazon Linux) if you add the sharp library to your dependencies and upload the zip (or deploy it using Claudia):

npm install --save sharp

// index.js — sharp is merely required here, never even used,
// yet the function still crashes at module initialization
const sharp = require('sharp')

exports.handler = async event => {
    console.log('Hello')
};

You will get the following error:

module initialization error: Error
at Object.hasVendoredLibvips (/var/task/node_modules/sharp/lib/libvips.js:61:13)
at Object.<anonymous> (/var/task/node_modules/sharp/lib/constructor.js:9:22)
at Module._compile (module.js:652:30)
at Object.Module._extensions..js (module.js:663:10)
at Module.load (module.js:565:32)
at tryModuleLoad (module.js:505:12)
at Function.Module._load (module.js:497:3)
at Module.require (module.js:596:17)
at require (internal/module.js:11:18)
at Object.<anonymous> (/var/task/node_modules/sharp/lib/index.js:3:15)

It is also always a good idea to compile/transpile your code with the same Node version that AWS Lambda uses. Using an EC2 instance or a virtual machine would be the perfect solution if it weren't so impractical (and it would cost money too). We will use Docker instead. At the moment AWS Lambda is using Node 8.10, so we will use an image that has this version. The lambci/lambda image mimics the live AWS Lambda environment:

https://hub.docker.com/r/lambci/lambda/
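As a quick sanity check (assuming Docker is already installed), you can confirm that the build image ships the same Node version as the Lambda runtime before relying on it:

docker pull lambci/lambda:build-nodejs8.10
docker run --rm lambci/lambda:build-nodejs8.10 node --version   # should print v8.10.x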

Since Claudia.js will run inside Docker, we need to pass our AWS credentials into the container. On my local machine the credentials are stored in ~/.aws/credentials, but inside the image they are expected under /root/.aws/credentials. So I will create a volume that does this mapping. I also need to make all my working files (including package.json and all my code) available inside Docker, and I can mount a volume for this too: I will map my working folder ($PWD) to the /claudia folder inside the container.
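If you have not set up the credentials file yet, it is a plain INI file. A minimal sketch with placeholder values (the keys below are obviously not real) looks like this:

# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY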

Create a bash script (e.g. claudia-deploy.sh) and update its content like below:

#!/bin/bash
docker run -v "$PWD":/claudia -v "$HOME"/.aws:/root/.aws --rm lambci/lambda:build-nodejs8.10 /bin/bash -c "\
cd /claudia && \
rm -rf node_modules && \
npm install -g claudia && \
npm run claudia-deploy"

And make this script executable:

chmod +x claudia-deploy.sh
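If you use a named AWS profile rather than the default one, you can forward it into the container too. Below is a sketch assuming the AWS SDK inside the container honors the standard AWS_PROFILE environment variable; the profile name my-profile is a placeholder:

docker run -v "$PWD":/claudia -v "$HOME"/.aws:/root/.aws \
    -e AWS_PROFILE=my-profile --rm lambci/lambda:build-nodejs8.10 \
    /bin/bash -c "cd /claudia && rm -rf node_modules && npm install -g claudia && npm run claudia-deploy"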

Add the scripts below to your package.json:

{
  "scripts": {
    "claudia-deploy": "claudia create --no-optional-dependencies --name my-aws-function --handler index.handler",
    "claudia-update": "claudia update --no-optional-dependencies",
    "deploy": "./claudia-deploy.sh",
    "update": "./claudia-update.sh"
  }
}

You should not run npm run claudia-deploy or npm run claudia-update directly on your local machine; those two scripts are meant to be run inside the container. What we execute on the local machine is npm run deploy, which starts the container and runs the Claudia script inside it:

npm run deploy

For updates, we need another bash script (claudia-update.sh). Remember to make it executable as well (chmod +x claudia-update.sh):

#!/bin/bash
docker run -v "$PWD":/claudia -v "$HOME"/.aws:/root/.aws --rm lambci/lambda:build-nodejs8.10 /bin/bash -c "\
cd /claudia && \
rm -rf node_modules && \
npm install -g claudia && \
npm run claudia-update"

Then run the command below. This works because your working folder is mounted into the container, so the claudia.json file that claudia create generated is still there for claudia update to read:

npm run update
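To verify that the deployed function actually initializes correctly, you can invoke it remotely. One option (my own addition, not part of the original workflow) is Claudia's test-lambda command, run through the same container so it can see your credentials:

docker run -v "$PWD":/claudia -v "$HOME"/.aws:/root/.aws --rm lambci/lambda:build-nodejs8.10 \
    /bin/bash -c "cd /claudia && npm install -g claudia && claudia test-lambda"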

Using this method, we make sure that all the artifacts we generate are fully compatible with AWS Lambda. As I said before, if you're not using any dependencies, or your dependencies are pure JavaScript libraries, you'll most likely be OK. But if you use libraries like sharp (for example, to reduce the size of images uploaded to an S3 bucket), you need a solution like this.
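For completeness, here is a minimal sketch of what such a handler could look like. It is not the code from this article, and the event field imageBase64 is a made-up input used purely for illustration:

// index.js — hypothetical example: resize an image passed in the event
const sharp = require('sharp')

exports.handler = async event => {
    // imageBase64 is an assumed input field, not a real S3 event shape
    const input = Buffer.from(event.imageBase64, 'base64')
    // resize to 200px wide, preserving the aspect ratio
    const output = await sharp(input).resize(200).toBuffer()
    return { resizedBytes: output.length }
};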