Using the Python SDK to publish messages to GCP Pub/Sub. The code runs inside a Kubernetes pod on GKE.
import pymysql   # not used in this excerpt, but installed in the image
import os
import argparse
import time

from google.cloud import pubsub_v1

entries = ['jelly']

def publish_messages(project, topic_name):
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project, topic_name)
    for n in entries:
        data = u'Message number {}'.format(n)
        # Pub/Sub message payloads must be bytestrings
        data = data.encode('utf-8')
        # publish() returns a future; block on it so the batch is flushed
        # before this short-lived script exits
        future = publisher.publish(topic_path, data=data)
        future.result()
        print("Message %s sent to queue" % n)
The script works fine when executed manually. However, it fails when triggered via cron with this error:
No handlers could be found for logger "google.cloud.pubsub_v1.publisher._batch.thread"
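One quick way to see what is going wrong is to compare the environment cron gives the job with the one an interactive shell in the pod has; a simple check along these lines (the output path is arbitrary):

# interactively inside the pod: the variable is set
printenv GOOGLE_APPLICATION_CREDENTIALS

# temporary cron entry: dump the environment cron jobs actually see
* * * * * printenv > /tmp/cron-env.txt 2>&1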
Found the solution. By default, crontab doesn't read the system environment variables, and the Python code above needs the environment variable GOOGLE_APPLICATION_CREDENTIALS, which points to the service account key (provided via a ConfigMap in this case). To work around this, all the environment variables must be written to the container's /etc/environment file at runtime. Something like this:
FROM ubuntu:latest
ADD send.py Jelly/send.py
COPY jellycron /etc/cron.d/jellycron
RUN apt-get update && apt-get install -y cron vim mysql-server curl python python-pip && \
    pip install --upgrade pymysql google-api-python-client google-cloud google-cloud-pubsub && \
    touch /var/log/cron.log && \
    chmod 0644 /etc/cron.d/jellycron && \
    crontab /etc/cron.d/jellycron
CMD printenv >> /etc/environment && cron && tail -f /var/log/cron.log
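The contents of the jellycron schedule file aren't shown. Since the Dockerfile loads it with crontab, it would use the plain five-field format without a user column; a minimal sketch, where the every-five-minutes schedule and the /Jelly/send.py path (implied by the ADD line) are assumptions:

# hypothetical jellycron: publish every 5 minutes, explicitly loading
# /etc/environment with auto-export so GOOGLE_APPLICATION_CREDENTIALS
# reaches the Python process regardless of cron's PAM configuration
*/5 * * * * set -a; . /etc/environment; set +a; python /Jelly/send.py >> /var/log/cron.log 2>&1

On Debian/Ubuntu images cron typically also passes /etc/environment to jobs through pam_env, which is why simply appending the variables there in the CMD is enough; sourcing the file explicitly in the job just makes that dependency visible.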