Cannot send data to Minio using Argo workflow running on Minikube

4/24/2019

I'm testing Argo workflows on Minikube, and I'm using Minio to upload/download data created within the workflow. When I submit the template YAML, the pod fails with a "failed to save outputs" error.

I checked the logs using kubectl logs -n air [POD NAME] -c wait; the result is below.

time="2019-04-24T04:25:27Z" level=info msg="Creating a docker executor"
time="2019-04-24T04:25:27Z" level=info msg="Executor (version: v2.2.1, build_date: 2018-10-11T16:27:29Z) and goes on and on
time="2019-04-24T04:25:27Z" level=info msg="Waiting on main container"
time="2019-04-24T04:25:29Z" level=info msg="main container started with container ID: 86afd5f5a35fbea3fcd65fdf565f8194d79535034d94548bb371681faf549e6e"
time="2019-04-24T04:25:29Z" level=info msg="Starting annotations monitor"
time="2019-04-24T04:25:29Z" level=info msg="docker wait 86afd5f5a35fbea3fcd65fdf565f8194d79535034d94548bb371681faf549e6e"
time="2019-04-24T04:25:29Z" level=info msg="Starting deadline monitor"
time="2019-04-24T04:25:33Z" level=info msg="Main container completed"
time="2019-04-24T04:25:33Z" level=info msg="No sidecars"
time="2019-04-24T04:25:33Z" level=info msg="Saving output artifacts"
time="2019-04-24T04:25:33Z" level=info msg="Saving artifact: get-data"
time="2019-04-24T04:25:33Z" level=info msg="Archiving 86afd5f5a35fbea3fcd65fdf565f8194d79535034d94548bb371681faf549e6e:/data/ to /argo/outputs/artifacts/get-data.tgz"
time="2019-04-24T04:25:33Z" level=info msg="sh -c docker cp -a 86afd5f5a35fbea3fcd65fdf565f8194d79535034d94548bb371681faf549e6e:/data/ - | gzip > /argo/outputs/artifacts/get-data.tgz"
time="2019-04-24T04:25:33Z" level=info msg="Annotations monitor stopped"
time="2019-04-24T04:25:34Z" level=info msg="Archiving completed"
time="2019-04-24T04:25:34Z" level=info msg="Creating minio client 192.168.99.112:31774 using IAM role"
time="2019-04-24T04:25:34Z" level=info msg="Saving from /argo/outputs/artifacts/get-data.tgz to s3 (endpoint: 192.168.99.112:31774, bucket: reseach-bucket, key: /data/)"
time="2019-04-24T04:25:34Z" level=info msg="Deadline monitor stopped"
time="2019-04-24T04:26:04Z" level=info msg="Alloc=3827 TotalAlloc=11256 Sys=9830 NumGC=4 Goroutines=7"
time="2019-04-24T04:26:04Z" level=fatal msg="Get http://169.254.169.254/latest/meta-data/iam/security-credentials: dial tcp 169.254.169.254:80: i/o and goes on and on 

And the template YAML file looks like this:

apiVersion: argoproj.io/v1alpha1
kind: Workflow
...

########################################
  - name: template-data-handling
    activeDeadlineSeconds: 10800
    outputs:
      artifacts:
      - name: get-data
        path: /data/
        s3:
          endpoint: 192.168.99.112:31774
          bucket: reseach-bucket
          key: /data/
          secretKeySecret:
            name: minio-credentials
            key: accesskey
          secretKeySecret:
            name: minio-credentials
            key: secretkey
    retryStrategy:
      limit: 1
    container:
      image: demo-pipeline
      imagePullPolicy: Never
      command: [/bin/sh, -c]
      args:
        - |
          python test.py

Could someone help?

-- user3368526
argo-workflows
argoproj
kubernetes
minio

1 Answer

4/30/2019

Did you create the minio-credentials secret (containing the secretkey and accesskey entries) in the namespace where the workflow is running?

Example: the Argo controller pod runs in the argo namespace, but the workflow template is submitted in the default namespace. In that case, the minio-credentials secret must be available in the default namespace.
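Note that the log line "Creating minio client ... using IAM role" suggests the executor could not load the credentials secret and fell back to the AWS instance-metadata endpoint (169.254.169.254), which does not exist on Minikube. A minimal sketch of the secret, assuming the workflow runs in the air namespace (as in the question's kubectl logs -n air) and using placeholder credentials you would replace with your own:

```yaml
# Hypothetical Secret manifest; the entry names must match the keys
# the workflow references (accesskey / secretkey).
apiVersion: v1
kind: Secret
metadata:
  name: minio-credentials
  namespace: air   # the namespace where the workflow pods run
type: Opaque
stringData:
  accesskey: <your-minio-access-key>
  secretkey: <your-minio-secret-key>
```

Apply it with kubectl apply -f minio-credentials.yaml, then verify it is visible in the right namespace with kubectl get secret minio-credentials -n air.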

-- Sarabala
Source: StackOverflow