I've got a basic architecture set up in Kubernetes: a Laravel container for my application-level code and a MySQL container for my database. I'm looking to implement a code-compiling API service (as a separate container) that accepts user-generated code, spins up a Docker container to compile it, and returns the output to the user.
There are some fairly raw implementations online, but most of them use Docker only to compile the user-generated code in an isolated environment (as you should), while the application itself is not hosted in containers or managed by a container orchestration system.
The question is: how can I spin up Docker containers to handle a task and return the output to my Laravel API container before shutting the container down?
Apparently, running a Docker container inside a Docker container is not best practice.
The process:
1. A user submits their code through the API.
2. A container is spun up to compile (and run) that code in isolation.
3. The output is returned to my Laravel API container.
4. The compile container is shut down.

I'm running my app in a Kubernetes cluster, so a Docker/Kubernetes-based solution is needed. I'd rather not run raw container-start commands in my application-level code; I'm looking for a higher-level solution.
You can use the Kubernetes Job resource to perform this kind of task.
A Job object can be spawned to run a process and can be set to terminate automatically once that process finishes. A Job in Kubernetes is a supervisor for pods carrying out a batch process, that is, a process that runs for a certain time until completion. You can run multiple pod instances under one Job (in parallel or sequentially).
Check this page for more details about Jobs.
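As a rough illustration, a single compile-and-exit task could be described with a Job spec like the sketch below. It is written as a Python dict so it can be submitted through the official Kubernetes Python client later; the names, the compiler image, the command, and the resource limits are all placeholders, not something taken from your setup.

```python
# Sketch of a Job spec for a one-off compile task.
# Job name, image, command, and limits are placeholders (assumptions).
def compile_job_manifest(job_name: str, source_path: str) -> dict:
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": job_name},
        "spec": {
            "backoffLimit": 0,               # do not retry a failed compile
            "activeDeadlineSeconds": 60,     # hard cap on runtime for untrusted code
            "ttlSecondsAfterFinished": 300,  # let Kubernetes garbage-collect the Job
            "template": {
                "spec": {
                    "restartPolicy": "Never",  # Jobs require Never or OnFailure
                    "containers": [{
                        "name": "compiler",
                        "image": "gcc:13",     # placeholder compiler image
                        "command": ["sh", "-c",
                                    f"gcc {source_path} -o /tmp/a.out && /tmp/a.out"],
                        "resources": {
                            "limits": {"cpu": "500m", "memory": "256Mi"},
                        },
                    }],
                },
            },
        },
    }
```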
So basically your flow should look like this (a code sketch follows the list):
1. Your Laravel API container receives the user-generated code.
2. It calls the Kubernetes API to create a Job whose pod compiles (and, if needed, runs) the code in an isolated container.
3. When the Job completes, the output is read back (for example from the pod's logs), returned to the user, and the Job is cleaned up.
The delivery of the code into the Job's container, and of the resulting binary or output back to your API, is something you have to code yourself.
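Here is a minimal sketch of that flow using the official Kubernetes Python client; in a Laravel service you would do the equivalent through a PHP Kubernetes client or plain HTTP calls to the API server. `compile_job_manifest` is the hypothetical helper from the sketch above, and the namespace and timeout values are assumptions.

```python
import time
import uuid
from kubernetes import client, config

def run_compile_job(source_path: str, namespace: str = "default",
                    timeout: int = 90) -> str:
    """Create a Job for one compile request, wait for it, return its output."""
    config.load_incluster_config()  # running inside the cluster, see below
    batch = client.BatchV1Api()
    core = client.CoreV1Api()

    job_name = f"compile-{uuid.uuid4().hex[:8]}"
    batch.create_namespaced_job(namespace=namespace,
                                body=compile_job_manifest(job_name, source_path))

    # Poll the Job status until it succeeds, fails, or the timeout expires.
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = batch.read_namespaced_job_status(job_name, namespace).status
        if (status.succeeded or 0) > 0 or (status.failed or 0) > 0:
            break
        time.sleep(1)

    # The Job's pod carries a "job-name" label; its logs are the compile output.
    pods = core.list_namespaced_pod(namespace,
                                    label_selector=f"job-name={job_name}")
    if not pods.items:
        raise RuntimeError(f"no pod found for job {job_name}")
    return core.read_namespaced_pod_log(pods.items[0].metadata.name, namespace)
```

Because the Job spec above sets `ttlSecondsAfterFinished`, the finished Job and its pod are removed automatically, which covers the "shut the container down" part of your question.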
This documentation link shows how to connect to the API; see in particular the section "Accessing the API from a Pod".
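For completeness, this is roughly what that looks like with the Python client. When the code runs inside your Laravel (or dedicated compile-API) pod, the client authenticates with the pod's service account, which must be granted RBAC permission to create Jobs and read pod logs.

```python
from kubernetes import config

# Inside a pod: the client reads the service account token and CA certificate
# mounted under /var/run/secrets/kubernetes.io/serviceaccount/ and finds the
# API server via the KUBERNETES_SERVICE_HOST / KUBERNETES_SERVICE_PORT env vars.
config.load_incluster_config()

# Outside the cluster (local development), fall back to your kubeconfig:
# config.load_kube_config()
```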