How to execute scripts to import CSV files into a Postgres instance deployed on a Kubernetes cluster?

1/15/2019

I'm trying to deploy Postgres on my Kubernetes cluster and I have been successful with that, but I don't know how to import my data, which is in CSV format. I already have scripts that take the path to the data and create a database in a local Postgres instance, but when I deploy Postgres on the Kubernetes cluster those scripts won't work, because they aren't visible inside the pod.

I was looking for a way to execute the scripts from the host inside the pod, or to expose the directory with the scripts and data to the Postgres pod. I found the hostPath solution, but I don't know how to define multiple volumes for a Deployment (I'm using a Rook cluster to provision the volume). Maybe there is a way to define a hostPath volume alongside the Rook volume, so I can access the scripts and CSV files from the hostPath and create the database on the Rook volume. I'm not sure if this makes sense, but I would appreciate it if someone could help me with this.
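Something like the sketch below is what I'm aiming for; the PVC name, image, and host path are just placeholders I made up:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: postgres
spec:
  selector:
    matchLabels:
      app: postgres
  template:
    metadata:
      labels:
        app: postgres
    spec:
      containers:
      - name: postgres
        image: postgres
        volumeMounts:
        - name: data
          mountPath: /var/lib/postgresql/data   # database files on the Rook-backed volume
        - name: import
          mountPath: /import                    # scripts and CSV files from the host
      volumes:
      - name: data
        persistentVolumeClaim:
          claimName: rook-postgres-pvc          # placeholder: PVC provisioned by Rook
      - name: import
        hostPath:
          path: /path/on/the/node/with/csv-and-scripts   # placeholder host directory
          type: Directory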

-- Fatemeh Rouzbeh
docker-volume
kubernetes
persistent-volumes
postgresql

1 Answer

1/16/2019

If you're using the official docker image, or an image derived from it that didn't override its entrypoint, then its documentation covers /docker-entrypoint-initdb.d/*.sql, with the tl;dr as

kind: ConfigMap
apiVersion: v1
metadata:
  name: my-initdb-configmap
data:
  import_csv.sql: |
    COPY my_table FROM '/whatever/path/you/want.csv' WITH (FORMAT csv); /* etc */
---
kind: Pod
apiVersion: v1
metadata:
  name: postgres
spec:
  containers:
  - name: postgres
    image: postgres
    volumeMounts:
    - name: my-initdb-configmap
      mountPath: /docker-entrypoint-initdb.d
      readOnly: true
    # ...
  volumes:
  - name: my-initdb-configmap
    configMap:
      name: my-initdb-configmap

type deal
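One thing to watch: the path in COPY is resolved inside the container, so the CSV has to exist there too. A sketch of one way to do that (the file name and columns here are made-up placeholders) is to ship a small CSV in the same ConfigMap so it lands next to the script:

kind: ConfigMap
apiVersion: v1
metadata:
  name: my-initdb-configmap
data:
  import_csv.sql: |
    -- the CSV below is mounted into the same directory by the ConfigMap
    COPY my_table FROM '/docker-entrypoint-initdb.d/my_table.csv' WITH (FORMAT csv, HEADER true);
  my_table.csv: |
    id,name
    1,example

For anything large you'd mount the CSVs from a volume (for example the hostPath from the question) instead, since ConfigMaps are limited to roughly 1MiB. Also note the init scripts only run when the data directory is empty, i.e. on the first startup.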

-- mdaniel
Source: StackOverflow