I appear to be missing some configuration somewhere but have no idea where.
My app (Go) uses Stackdriver Logging. When I run it locally it works fine and my log messages show up in Stackdriver. When I run it in GKE, my custom logger messages do not show up.
Anything written to standard out (fmt.Println()) does show up in Stackdriver, but without the proper severity, and I would rather use the logging API. What could I have misconfigured in my GKE instance that makes Google logging not show up?
Sorry for the vague post but I don't have much to go on, not getting any errors.
Here is my code for sending messages if it helps.
func logMessage(message string, transactionID string, severity logging.Severity) {
	ctx := context.Background()

	// Creates a client.
	client, err := logging.NewClient(ctx, loggingData.ProjectID)
	if err != nil {
		log.Fatalf("Failed to create client: %v", err)
	}

	// Selects the log to write to.
	logger := client.Logger(loggingData.LogName)
	logger.Log(logging.Entry{Payload: message, InsertID: transactionID, Severity: severity})

	// Closes the client and flushes the buffer to the Stackdriver Logging
	// service.
	if err := client.Close(); err != nil {
		log.Panicln("Failed to close client:", err)
	}
}
Update:
I got SSH working into the node (VM instance) and confirmed that FluentD appears to be running and picking up changes. I opened the log files and only saw entries from fmt.Println, and nothing from the Go cloud Logger.
Maybe I am not understanding how Google Stackdriver Logging (https://godoc.org/cloud.google.com/go/logging) is supposed to work? Clearly I am missing something and just not sure what yet.
Thanks
I had a similar issue with a Go app running in Cloud Run.
I was able to find the Stackdriver logs in the Logs Viewer web UI with the filter resource.type = "project".
On GKE, the fluentd agent is used and is included in the VM image (node). Its role is to watch for changes to Docker log files that live in the directory /var/lib/docker/containers/
and are symbolically linked to from the /var/log/containers
directory using names that capture the pod name and container name. These logs are then submitted to Google Cloud Logging, which assumes the installation of the cloud-logging plug-in.
You can customize your agent configuration to stream logs from additional inputs, such as unstructured (text) or structured (JSON) logs via log files.
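As a sketch of what such a customization can look like, here is a tail-style input block for streaming structured (JSON) logs from a file. The path, pos_file location, and tag are assumptions for illustration; adjust them to your setup:

```
<source>
  @type tail
  # Parse each line of the file as a JSON object.
  format json
  # Hypothetical application log file; change to your own path.
  path /var/log/myapp/*.log
  # Position file so the agent remembers where it left off (assumed location).
  pos_file /var/lib/google-fluentd/pos/myapp.pos
  read_from_head true
  tag myapp
</source>
```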
By default, fluentd extracts local_resource_id from the tag for the 'k8s_container' monitored resource. The format is:
'k8s_container.<namespace_name>.<pod_name>.<container_name>'.
The fluentd agent renames the field 'log' to the more generic field 'message'. This way the fluent-plugin-google-cloud plugin knows to flatten the field as textPayload instead of jsonPayload, after extracting 'time', 'severity' and 'stream' from the record.
If 'severity' is not set, stderr is assumed to be ERROR and stdout INFO.
The agent can be enabled when creating the cluster; the default fluentd pod will then be created.
You can also perform a manual installation of the stackdriver-logging-agent (fluentd) in GKE.
First, I suggest you check whether you have an agent running.
To do so, SSH into your node and run the command line below:
ps ax | grep fluentd
Output example:
2284 ? Sl 0:00 /opt/google-fluentd/embedded/bin/ruby /usr/sbin/google-fluentd [...]
2287 ? Sl 42:44 /opt/google-fluentd/embedded/bin/ruby /usr/sbin/google-fluentd [...]
Do a test by running the command line below:
logger "Some test message"
Check Stackdriver Logging for your test message.