Author: GOCO Copenhagen
You will need to define a named AWS profile, with administrator rights, using AWS CLI v2. The profile for this example is named `devprofile` and is only used in the `package.json`. If you wish to change the name of your local profile, you only need to change the name there and give the profile that name when you create it.
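Creating the named profile with AWS CLI v2 could look like this (the CLI will prompt you for the access key, secret key, default region and output format):

```sh
aws configure --profile devprofile
```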
This service creates AWS databases and APIs using the Serverless Framework. Serverless is a framework for building unified serverless applications in various programming languages and with various service providers.
You will need to create this file yourself. Never commit the `env.yml` file to GitHub!

```yaml
prod:
  testEnv: "This is a test env on production"
  ...
default:
  testEnv: "This is a test env on development/catch all"
  awsLocalDynammoAccessKey: "special_access_key"
  awsLocalDynammoSecretAccessKey: "special_secret_access_key"
  ...
```
Please notice the `awsLocalDynammoAccessKey` and the `awsLocalDynammoSecretAccessKey`. These two variables are used for the local environment (otherwise the local version will not run), and you should define them by generating a local DynamoDB access key for the specific AWS IAM role.
Working with serverless.yml
This is where the magic happens.
`npm run test`
Executes everything that has been wrapped in the `test(:callback)` function.

`npm start`
Starts the API locally on port 3000.

`npm run deploy-dev`
Deploys the API to the development environment using the named AWS profile.

`npm run deploy-database-dev`
Deploys the database to the development environment using the named AWS profile.
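A hypothetical `package.json` scripts block matching the commands above; the exact scripts in the boilerplate may differ (e.g. the test runner, or whether `serverless-offline` is used for local development):

```json
{
  "scripts": {
    "test": "jest",
    "start": "serverless offline --stage dev --aws-profile devprofile",
    "deploy-dev": "serverless deploy --stage dev --aws-profile devprofile",
    "deploy-database-dev": "serverless deploy --stage dev --aws-profile devprofile --config database.yml"
  }
}
```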
- Install serverless using `npm i -g serverless` or equivalent
- Create the `env.yml` file
- `npm run deploy-database-dev`
- `npm start`
- Go to `localhost:3000`
- Congratulations!
- You can test if your API works using the `mirror` endpoint in your browser.
- You can also test if your connection to the database works using the `createandlist/{id}` endpoint in your browser. (Notice you should use a number as the id reference.)
IAM (Identity and Access Management) allows us to restrict and define access levels on AWS.
IAM roles are widely used, so this can be overwhelming at first sight. However, the scope we are using IAM for in `serverless.yml` is to give our service access to other services, e.g. giving our functions access to our database.
We should only give access to the services we actually use. E.g. in this example we give the function access to all resources, but only the `sendTemplatedEmail` action, within SES. In this other example we only give access to the `${self:custom.stage}-TestTableArn` table, and we only allow our service to do simple CRUD operations on the DynamoDB table.
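A sketch of how these two statements could look in `serverless.yml` (the exact action lists are illustrative):

```yaml
provider:
  iamRoleStatements:
    # All SES resources, but only the one action we actually use
    - Effect: Allow
      Action:
        - ses:SendTemplatedEmail
      Resource: "*"
    # Simple CRUD on one specific DynamoDB table, imported from another stack
    - Effect: Allow
      Action:
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
        - dynamodb:Query
      Resource:
        - Fn::ImportValue: ${self:custom.stage}-TestTableArn
```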
- `*`: Select everything, e.g. `ses:*`
- `Effect`: What we want to do: `Allow`/`Deny`
- `Action`: The services this statement applies to, e.g. `dynamodb:DescribeTable`
- `Arn`: Amazon Resource Name.
- `Fn::ImportValue`: Returns an output generated from another stack (e.g. from the database schema stack)
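As a sketch, the database stack could export the table ARN like this (resource and output names illustrative), which the service stack then consumes via `Fn::ImportValue`:

```yaml
resources:
  Outputs:
    TestTableArn:
      Value:
        Fn::GetAtt: [TestTable, Arn]
      Export:
        Name: ${self:custom.stage}-TestTableArn
```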
`serverless.yml/custom` is used as the top level. Here we have stages, regions and other global service variables.
`env.yml` should be used for external API keys and other variables outside the service scope.
`serverless.yml/provider.environment` holds the service-level environment variables. Here we define the variables we want to include in this service. We need to include all the `custom` and `env.yml` variables here. This might seem like double work, but it gives security advantages when we have multiple service scopes nested in a service.
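A sketch of how these three layers could fit together (key names are illustrative):

```yaml
custom:
  stage: ${opt:stage, self:provider.stage}

provider:
  environment:
    # Re-exported from custom so functions can read it at runtime
    stage: ${self:custom.stage}
    # Pulled from env.yml: stage-specific value, falling back to default
    testEnv: ${file(env.yml):${self:custom.stage}.testEnv, file(env.yml):default.testEnv}
```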
This serverless framework will build a REST HTTPS API service, and after your first deploy the service will be fully available via an AWS domain.
The API is a set of functions, made available through our custom routes. We define our functions in the `serverless.yml`.
All functions must have a name, which is used as a reference on the CloudWatch- and Lambda console.
Example from `serverless.yml`:

```yaml
mirror:
  handler: api/mirror.main
  events:
    - http:
        path: mirror
        method: get
    - http:
        path: mirror
        method: put
    - schedule:
        rate: rate(24 hours)
```
The handler is the path of the function we will execute. Please notice the `main` in this example. Usually you will use names such as `get`, `create` or `update`.
Please remember it is possible to export multiple functions in the same file. This will be specified in the CRUD part.
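Exporting several wrapped functions from one file could look like this sketch. The file name `api/notes.ts` is hypothetical, and a minimal local stand-in replaces `libs/handler-lib` so the example is self-contained; the boilerplate's real wrapper may differ.

```typescript
// Sketch of a hypothetical api/notes.ts exporting multiple wrapped functions.
type ApiEvent = { body?: string };

// Minimal stand-in for libs/handler-lib: wraps the business logic and
// converts the result (or error) into an HTTP-style response.
const handler =
  <T>(fn: (event: ApiEvent) => Promise<T>) =>
  async (event: ApiEvent) => {
    try {
      // On success, serialize the handler's result as the HTTP body
      return { statusCode: 200, body: JSON.stringify(await fn(event)) };
    } catch (e) {
      return { statusCode: 500, body: JSON.stringify({ error: String(e) }) };
    }
  };

// serverless.yml would reference these as api/notes.get and api/notes.create
export const get = handler(async () => ({ note: "a single note" }));
export const create = handler(async (event) => ({
  created: JSON.parse(event.body ?? "{}"),
}));
```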
Events are the triggers for specific functions.
When creating a REST API we will usually use the `http` event, nested with HTTP methods and paths, to define the route for an endpoint.
You can also use the `schedule` event to run a cron job, or any of the other event types.
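For instance, a scheduled function could look like this sketch (the function name and handler path are illustrative); both `rate(...)` and `cron(...)` expressions are supported:

```yaml
cleanup:
  handler: api/cleanup.main
  events:
    # Runs every day at 12:00 UTC
    - schedule: cron(0 12 * * ? *)
```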
This boilerplate implements a Handler Library which all functions should be wrapped in. This gives a great level of consistency.
```typescript
import { Context, APIGatewayEvent } from "aws-lambda";
import handler from "libs/handler-lib";
import { YOUR_RESPONSE_TYPE } from "./responeTypes";

export const get = handler(
  async (event: APIGatewayEvent, context: Context): Promise<YOUR_RESPONSE_TYPE> => {
    // YOUR LOGIC GOES HERE
    return ({ ... });
  }
);
```
You should replace YOUR_RESPONSE_TYPE
with your own response type based on whatever project you are working on.
TBA SOON
TBA SOON
WORK IN PROGRESS
- Under normal circumstances a developer will have local databases to work on. For this example we use an online database.

You can spawn a new database (only if necessary) by deploying with a custom stage (format: `dev-*****`). However, the first time you run `deploy-dev`, a database on stage `dev` will be created.
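Spawning such a database could look like this sketch (the stage name is illustrative):

```sh
serverless deploy --stage dev-myfeature --aws-profile devprofile
```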
TBA SOON
TBA SOON