Configure a cloud-based VS Code IDE on AWS

3/5/2022

CONTEXT: We have a platform where users can create their own projects, with multiple projects per user. We need to provide them with a browser-based IDE to edit those projects, and we decided to go with code-server. For this we need to configure an auto-scalable cluster on AWS: each time a user clicks "Edit Project", we will bring up a new container. https://hub.docker.com/r/codercom/code-server

QUESTION: How do we pass parameters from the URL query string (my-site.com/edit?project=1234) into a startup script, so the workspace inside the Docker container is pre-configured when it starts?

Let's say the stack is AWS + ECS + Fargate. We could use Kubernetes instead of ECS if that helps.

I don't have any experience with cluster configuration. I'd appreciate any help, or at least a direction to dig further.

-- Sergei
amazon-elb
amazon-fargate
amazon-web-services
code-server
kubernetes

1 Answer

3/12/2022

This can be achieved in multiple ways on AWS ECS. The basic requirement for such a system is to launch and terminate containers on the fly while persisting file changes. (I will focus on launching the containers.)

Using the AWS SDKs:

The task can be easily achieved with an AWS SDK and a base task definition: the SDK allows starting tasks with overrides applied to the base task definition.

E.g. if the task definition specifies 2 GB of memory, the SDK can override that memory with a parameterised value when launching a task from the task definition.

Refer to the boto3 (AWS SDK for Python) docs.

https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ecs.html#ECS.Client.run_task
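For example, here is a minimal sketch of launching one code-server task per "Edit Project" click with boto3, using a container environment-variable override to pass the project ID from the URL query into the container. The cluster, task definition, subnet, security group and container names below are placeholders, not values from the question:

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

def launch_editor_task(project_id: str) -> str:
    """Start a Fargate task from a base task definition, overriding the
    environment so the container's startup script can pre-configure the
    workspace for the requested project. Returns the new task's ARN."""
    response = ecs.run_task(
        cluster="code-server-cluster",        # placeholder cluster name
        taskDefinition="code-server-base",    # pre-registered base task definition
        launchType="FARGATE",
        count=1,
        networkConfiguration={
            "awsvpcConfiguration": {
                "subnets": ["subnet-0123456789abcdef0"],     # placeholder
                "securityGroups": ["sg-0123456789abcdef0"],  # placeholder
                "assignPublicIp": "DISABLED",
            }
        },
        overrides={
            "containerOverrides": [
                {
                    "name": "code-server",    # container name in the task definition
                    "environment": [
                        {"name": "PROJECT_ID", "value": project_id}
                    ],
                }
            ]
        },
    )
    return response["tasks"][0]["taskArn"]

# e.g. for my-site.com/edit?project=1234
task_arn = launch_editor_task("1234")
```

The container's entrypoint script can then read PROJECT_ID and clone or mount the right project before starting code-server.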

Overall Solution

Now that we know how to run custom tasks with the Python SDK on demand, the overall flow for your application is: your API calls an AWS Lambda function with the parameters, the function spins up a task, keeps checking the task status, and routes traffic to it once the status is healthy.

  1. The API calls an AWS Lambda function with the parameters.
  2. The Lambda function uses the AWS SDK to create a new task with overrides from the base task definition (assuming the base task definition already exists).
  3. Keep checking the status of the new task in the same function call, and set a flag in your database so your front end can react to it.
  4. Once the status is healthy, add a rule to the Application Load Balancer using the AWS SDK to route traffic to the task's IP without exposing the IP address to the end client. (The AWS Application Load Balancer can get expensive; I'd advise using Nginx or HAProxy on EC2 to manage dynamic routing.) A sketch of steps 3 and 4 follows this list.
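Continuing from the run_task sketch above, steps 3 and 4 might look roughly like this inside the Lambda function. The cluster name, target group ARN and port are placeholders, the target group is assumed to have target type "ip", and the database flag from step 3 is left out:

```python
import boto3

ecs = boto3.client("ecs")
elbv2 = boto3.client("elbv2")

CLUSTER = "code-server-cluster"  # placeholder, same cluster as the run_task call
TARGET_GROUP_ARN = "arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/code-server/abc123"  # placeholder

def route_when_running(task_arn: str) -> str:
    """Wait for the task to reach RUNNING, then register its private IP
    behind the load balancer so the client never sees the IP (steps 3-4)."""
    # Step 3: poll until ECS reports the task as RUNNING.
    ecs.get_waiter("tasks_running").wait(cluster=CLUSTER, tasks=[task_arn])

    # Each Fargate task with awsvpc networking gets its own ENI and private IP.
    task = ecs.describe_tasks(cluster=CLUSTER, tasks=[task_arn])["tasks"][0]
    private_ip = task["containers"][0]["networkInterfaces"][0]["privateIpv4Address"]

    # Step 4: register the IP with an "ip"-type target group; the ALB (or your
    # Nginx/HAProxy layer) forwards traffic to it without exposing the address.
    elbv2.register_targets(
        TargetGroupArn=TARGET_GROUP_ARN,
        Targets=[{"Id": private_ip, "Port": 8080}],  # code-server's default port
    )
    return private_ip
```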

Note:

Ensure your image is lightweight and that startup takes less than 15 minutes, as a Lambda function cannot run longer than that. If it does, create a microservice for launching the ad-hoc containers and host it on EC2 instead.

Using Terraform:

If you are looking for infrastructure provisioning, Terraform is the way to go. It has a learning curve, so I recommend it as a secondary option.

Terraform is popular for parameterising infrastructure with variables, and it can be plugged in easily behind an API. The flow of your application remains the same from step 1, but instead of AWS Lambda, the API calls your ad-hoc container microservice, which in turn runs the Terraform configuration and passes variables to it.

Refer to the Terraform docs for AWS:

https://registry.terraform.io/providers/hashicorp/aws/latest
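As a rough sketch, the microservice could shell out to the Terraform CLI and pass the project ID as a variable. The working directory and variable name here are assumptions, and the Terraform configuration itself would need to declare a matching variable "project_id":

```python
import subprocess

def apply_project_stack(project_id: str, workdir: str = "./terraform/code-server") -> None:
    """Run `terraform init` and `terraform apply`, passing the project ID
    as a Terraform variable so the launched task is parameterised per project."""
    subprocess.run(["terraform", "init", "-input=false"], cwd=workdir, check=True)
    subprocess.run(
        [
            "terraform", "apply",
            "-auto-approve",
            "-input=false",
            "-var", f"project_id={project_id}",
        ],
        cwd=workdir,
        check=True,
    )

# e.g. triggered by the API for my-site.com/edit?project=1234
apply_project_stack("1234")
```

In practice you would likely want separate state per project (for example Terraform workspaces or a keyed backend); otherwise each apply would replace the previous project's resources instead of adding a new container alongside them.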

-- Nick
Source: StackOverflow