Stackdriver logs not showing up in GKE

4/10/2019

I am unable to see log messages sent from my GKE cluster using Go. Logging works fine when running locally but not from the container running in GKE. Something is clearly misconfigured in GKE, but I don't see any errors and I'm not sure where to look. Any insight or places to check would be very helpful.

Below is my code and my cluster scopes (if it helps).

Thanks.

Scopes:

oauthScopes:
- https://www.googleapis.com/auth/cloud-platform
- https://www.googleapis.com/auth/compute
- https://www.googleapis.com/auth/datastore
- https://www.googleapis.com/auth/devstorage.full_control
- https://www.googleapis.com/auth/devstorage.read_only
- https://www.googleapis.com/auth/logging.write
- https://www.googleapis.com/auth/monitoring
- https://www.googleapis.com/auth/monitoring.write
- https://www.googleapis.com/auth/pubsub
- https://www.googleapis.com/auth/service.management.readonly
- https://www.googleapis.com/auth/servicecontrol
- https://www.googleapis.com/auth/source.full_control
- https://www.googleapis.com/auth/sqlservice.admin
- https://www.googleapis.com/auth/trace.append

Code:

func LogMessage(logLevel ReddiyoLoggingSeverity, message, domain, transactionID string) {

    ctx := context.Background()
    // Creates a client.
    client, err := logging.NewClient(ctx, loggingData.ProjectID)
    if err != nil {
        log.Fatalf("Failed to create client: %v", err)
    }

    // Selects the log to write to.
    logger := client.Logger(loggingData.LogName)

    labels := make(map[string]string)
    labels["transactionID"] = transactionID
    labels["domain"] = domain

    var logSeverity logging.Severity
    switch logLevel {
    case debug:
        logSeverity = logging.Debug
    case info:
        logSeverity = logging.Info
    case warning:
        logSeverity = logging.Warning
    case reddiyoError:
        logSeverity = logging.Error
    case critical:
        logSeverity = logging.Critical
    case emergency:
        logSeverity = logging.Emergency
    default:
        logSeverity = logging.Warning
    }
    logger.Log(logging.Entry{
        Payload:  message,
        Severity: logSeverity,
        Labels:   labels,
    })
    // Closes the client and flushes the buffer to the Stackdriver Logging
    // service.
    if err := client.Close(); err != nil {
        log.Fatalf("Failed to close client: %v", err)
    }
}
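
As an aside, the function above creates a new client (and closes it, which flushes the buffer) on every call, which is expensive and means a transient failure kills the whole process via log.Fatalf. A minimal sketch of creating the client once and reusing it, using the same cloud.google.com/go/logging package (the project ID, log name, and function names here are placeholders, not the poster's real values):

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/logging"
)

// logClient is created once at startup and reused for every log call.
var logClient *logging.Client

func initLogging(projectID string) error {
	var err error
	logClient, err = logging.NewClient(context.Background(), projectID)
	return err
}

// logMessage writes one entry; the client buffers and flushes in the background.
func logMessage(logName, message string, severity logging.Severity, labels map[string]string) {
	logClient.Logger(logName).Log(logging.Entry{
		Payload:  message,
		Severity: severity,
		Labels:   labels,
	})
}

func main() {
	if err := initLogging("my-project-id"); err != nil {
		log.Fatalf("Failed to create logging client: %v", err)
	}
	// Close flushes any buffered entries before the process exits.
	defer logClient.Close()

	logMessage("my-log", "hello from GKE", logging.Info,
		map[string]string{"transactionID": "tx-123", "domain": "example"})
}
```

With this shape, a failure to write a single entry need not be fatal, and shutdown is the only place the buffer is explicitly flushed.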
-- mornindew
go
google-cloud-stackdriver
google-kubernetes-engine
kubernetes
logging

2 Answers

4/15/2019

See here:

https://cloud.google.com/kubernetes-engine/docs/tutorials/authenticating-to-cloud-platform

Your containerized client is unable to authenticate against the Cloud Platform Logging service.

You don't explain how you're authenticating when running the client locally, but the same mechanism needs to be reproduced on Kubernetes.

If you check the container's logs, they should confirm the failure to authenticate against the Logging service.

HTH!
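
For reference, both checks suggested above can be done from the command line (the deployment, cluster, pool, and zone names here are placeholders):

```shell
# Stream the container's own stdout/stderr, where a failed Stackdriver
# write (e.g. a permission-denied error) would appear:
kubectl logs deployment/my-app

# Confirm which OAuth scopes the node pool was created with:
gcloud container node-pools describe default-pool \
  --cluster my-cluster --zone us-central1-a \
  --format="value(config.oauthScopes)"
```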

-- DazWilkin
Source: StackOverflow

4/17/2019

So the solution was simpler than I expected. I don't fully understand it yet, but it appears to be how Stackdriver works.

When I run locally, my logs show up under Google Project --> Project ID --> Log Name.

When I run in GKE, they show up under VM Instance --> Instance ID (or All Instances) --> Log Name.

I would have expected them to show up under the Google project in both cases. Either that's not how Stackdriver works, or I misconfigured it.
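
If I understand the library correctly, this is the client auto-detecting the monitored resource it runs on: from a GKE node it attaches entries to the instance's resource, while locally it falls back to the project-level "global" resource. A sketch (an assumption about the fix, not code from the question) that pins every entry to "global" via the logging.CommonResource logger option so both environments land in the same place:

```go
package main

import (
	"cloud.google.com/go/logging"
	mrpb "google.golang.org/genproto/googleapis/api/monitoredres"
)

// newGlobalLogger returns a Logger whose entries are always attached to
// the "global" resource, overriding the auto-detected one.
func newGlobalLogger(client *logging.Client, logName, projectID string) *logging.Logger {
	return client.Logger(logName, logging.CommonResource(&mrpb.MonitoredResource{
		Type:   "global",
		Labels: map[string]string{"project_id": projectID},
	}))
}
```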

-- mornindew
Source: StackOverflow