I'm looking for an efficient and easy way to adapt my current Theano model so it can scale for prediction. I'm also looking for a way to easily train many models with different parameters.
It seems there are two main ways to do this: the first is to use Spark, and the second is to use Docker and Kubernetes.
My experience with both is fairly limited, so I have no idea whether either is the right way to solve my problem, or what the differences between the two solutions are.
The key difference between Kubernetes and Spark is this: Kubernetes is a PaaS-style platform; it gives you the infrastructure to run your applications. Spark is a distributed compute engine; it runs your algorithm across many machines, but it needs a cluster to run on. Kubernetes can provide that cluster, so the two can work together.
How do you run Spark on Kubernetes? You can see the reference.
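As a minimal sketch: Spark can submit jobs directly to a Kubernetes cluster via `spark-submit` (supported natively since Spark 2.3). The host, port, image name, and application path below are placeholders, not values from this question; you would substitute your own cluster endpoint, a Spark container image, and your training script.

```shell
# Sketch: submit a Spark application to a Kubernetes cluster.
# <k8s-apiserver-host>:<port> and <your-spark-image> are placeholders.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:<port> \
  --deploy-mode cluster \
  --name my-training-job \
  --conf spark.executor.instances=3 \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  local:///opt/spark/app/train.py
```

Kubernetes then schedules the driver and executor pods for you, which also makes it straightforward to launch many such jobs in parallel with different parameters.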
Good Luck!