How To Use AWS CloudWatch With On-Premise Application Components?

AWS CloudWatch offers centralized logging, monitoring, and analysis to make the developer’s job easier. A question that comes up for enterprise applications that follow a hybrid cloud deployment (that is, where one or more application components reside on-premise) is: how can we use CloudWatch Logs for the on-premise components? Is it even possible? The short answer is yes, and in this post we will see how to do that.

Why Use CloudWatch For On-Premise Components?

There are several benefits of using CloudWatch for On-Premise components.

  • Centralized logging: You can store the logs of your on-premise components centrally, using the same capabilities you already use for the rest of your cloud-based components.
  • Consistent timestamps: CloudWatch Logs stores log events with UTC timestamps, so you do not have to perform tedious time-zone conversions, which can eat up precious time when analyzing logs across several components, especially ones spread across geographies.
  • Consistent analysis: You can use the same tools and techniques that you use for your cloud-based components.
  • Easier access: You avoid issues such as log rollover or hard-to-reach logs. On-premise components are often managed by customers or other parties, so getting to their logs can require coordination and effort.

How To Publish To CloudWatch Logs From On-Premise Components?

There are a couple of approaches to accomplish this.

  1. Using the AWS CloudWatch Agent to publish logs: This can be extremely useful for on-premise components that follow an appliance model for deployment (such as a pre-baked image with the application components and dependencies). So, this approach is more configuration-centric and should not require code-level changes. Apart from log collection, the CloudWatch agent can also help in capturing system metrics (such as CPU and memory utilization).
  2. Using the CloudWatch Logs API to publish logs: This approach requires enhancing the application code to publish logs through the CloudWatch Logs API. You can, of course, wrap this in a reusable module or use a third-party library, but it remains a code-centric approach that offers more flexibility.

In this post, we will be talking about using the CloudWatch Logs API. If you are interested in using the CloudWatch Agent, please refer to the AWS CloudWatch Agent documentation for details.

Using The CloudWatch Logs API To Publish Logs

The following Java code (using the AWS SDK for Java v1) shows the CloudWatch Logs API usage.

package com.cloudnineapps.samples.aws;

import java.util.ArrayList;
import java.util.List;

import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.services.logs.AWSLogs;
import com.amazonaws.services.logs.AWSLogsClientBuilder;
import com.amazonaws.services.logs.model.CreateLogGroupRequest;
import com.amazonaws.services.logs.model.CreateLogStreamRequest;
import com.amazonaws.services.logs.model.DescribeLogGroupsRequest;
import com.amazonaws.services.logs.model.DescribeLogGroupsResult;
import com.amazonaws.services.logs.model.DescribeLogStreamsRequest;
import com.amazonaws.services.logs.model.DescribeLogStreamsResult;
import com.amazonaws.services.logs.model.InputLogEvent;
import com.amazonaws.services.logs.model.PutLogEventsRequest;
import com.amazonaws.services.logs.model.PutRetentionPolicyRequest;

/**
 * Sample client for AWS CloudWatch Logs API.
 */
public class AWSCloudWatchLogsSampleClient {

	/** The log group name. */
	private static final String LOG_GROUP = "/myapp/onprem/component-1";

	/** The log stream name. */
	private static final String LOG_STREAM = "app-log";
	
	/** The log retention period (in days). */
	private static final int LOG_RETENTION_PERIOD = 1;

	/** The AWS region. */
	private static String Region = "us-east-1";
	
	/** The CloudWatch Logs client. */
	private static AWSLogs Client;
	
	
	/** Opens the CloudWatch log. */
	public static void openCloudWatchLog() throws Exception {
		AWSCredentialsProvider creds = new DefaultAWSCredentialsProviderChain();
		Client = AWSLogsClientBuilder.standard()
				     .withCredentials(creds)
				     .withRegion(Region)
				     .build();
		// Create and set up the log group if it doesn't exist
		DescribeLogGroupsRequest request = new DescribeLogGroupsRequest().withLogGroupNamePrefix(LOG_GROUP);
		DescribeLogGroupsResult result = Client.describeLogGroups(request);
		if (result.getLogGroups().isEmpty()) {
			CreateLogGroupRequest logGroupRequest = new CreateLogGroupRequest(LOG_GROUP);
			Client.createLogGroup(logGroupRequest);
			PutRetentionPolicyRequest policyRequest = new PutRetentionPolicyRequest(LOG_GROUP, LOG_RETENTION_PERIOD);
			Client.putRetentionPolicy(policyRequest);
			CreateLogStreamRequest logStreamRequest = new CreateLogStreamRequest(LOG_GROUP, LOG_STREAM);
			Client.createLogStream(logStreamRequest);
			log("Created the log group and the log stream.");
		}
	}
	
	/** Logs the specified message. */
	public static void log(String msg) throws Exception {
		// Retrieve the sequence token in the log stream
		DescribeLogStreamsRequest request = new DescribeLogStreamsRequest().withLogGroupName(LOG_GROUP).withLogStreamNamePrefix(LOG_STREAM);
		DescribeLogStreamsResult result = Client.describeLogStreams(request);
		String seqToken = result.getLogStreams().get(0).getUploadSequenceToken();

		// Write to the log stream
		List<InputLogEvent> logEvents = new ArrayList<InputLogEvent>();
		InputLogEvent logEvent = new InputLogEvent().withMessage(msg).withTimestamp(System.currentTimeMillis());
		logEvents.add(logEvent);
		PutLogEventsRequest logRequest = new PutLogEventsRequest(LOG_GROUP, LOG_STREAM, logEvents).withSequenceToken(seqToken);
		Client.putLogEvents(logRequest);
	}
		
	/** Main */
	public static void main(String[] args) throws Exception {
		System.out.println("Launching the application...");
		openCloudWatchLog();
		// Sample log statements
		log("Starting the app...");
		log("Another message");
		System.out.println("Execution completed.");
	}
}

Let’s walk through the code. You can check out the Resources section for the complete code (including the Maven POM that can be used to compile and execute it).

  • The main() method invokes the openCloudWatchLog() method to initialize the CloudWatch Logs SDK client.
  • The openCloudWatchLog() method checks whether the required Log Group (/myapp/onprem/component-1) exists, using DescribeLogGroupsRequest and the Client.describeLogGroups() call. If it does not, the method creates the Log Group using CreateLogGroupRequest and the Client.createLogGroup() call. We should always ensure that an appropriate log retention period is set on the Log Group to avoid accruing huge logs and the cost that comes with them; this is accomplished using PutRetentionPolicyRequest and the Client.putRetentionPolicy() call. Finally, we create the Log Stream using CreateLogStreamRequest and the Client.createLogStream() call.
  • Next, the code uses the log() method to log sample messages. It uses DescribeLogStreamsRequest and the Client.describeLogStreams() call to retrieve the Log Stream and fetch the upload sequence token. This token must be included when publishing logs (except for the very first publish). Then, we create an InputLogEvent with the supplied message and timestamp, and publish it using PutLogEventsRequest and the Client.putLogEvents() call. As you might have noticed, you do not have to publish log statements individually; you can add multiple InputLogEvent objects to publish a batch of logs (a minimal batching sketch follows this list).
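To illustrate the batching point, here is a minimal sketch of a logBatch() method (the method name is illustrative) that builds on the client and constants defined in the class above. It publishes several messages in a single PutLogEvents call; note that the events in a batch must be in chronological order by their timestamps.

	/** Publishes a batch of messages in a single PutLogEvents call. */
	public static void logBatch(List<String> msgs) throws Exception {
		// Retrieve the current upload sequence token for the log stream
		DescribeLogStreamsRequest request = new DescribeLogStreamsRequest().withLogGroupName(LOG_GROUP).withLogStreamNamePrefix(LOG_STREAM);
		String seqToken = Client.describeLogStreams(request).getLogStreams().get(0).getUploadSequenceToken();

		// Build one InputLogEvent per message; events must be in chronological order by timestamp
		List<InputLogEvent> logEvents = new ArrayList<InputLogEvent>();
		for (String msg : msgs) {
			logEvents.add(new InputLogEvent().withMessage(msg).withTimestamp(System.currentTimeMillis()));
		}

		// Publish the whole batch with a single API call
		Client.putLogEvents(new PutLogEventsRequest(LOG_GROUP, LOG_STREAM, logEvents).withSequenceToken(seqToken));
	}

For example, logBatch(Arrays.asList("Starting the app...", "Another message")) would publish both messages with one network call instead of two.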

Using IAM Policy To Restrict Access To Specific Log

As part of such a setup, it is important to restrict access so that each on-premise component can access only its specific logs. The good thing is that you can enforce this using IAM as follows.

  • Create one or more IAM users for the on-premise components. Grant these users programmatic access only.
  • Create a custom policy (or use an inline policy) like the one shown below and attach it to the above IAM user(s).
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CloudWatchLogGroupQueryAccess",
            "Effect": "Allow",
            "Action": [
                "logs:DescribeLogGroups"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        },
        {
            "Sid": "CloudWatchLogsAccess",
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:PutRetentionPolicy",
                "logs:DescribeLogStreams",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "arn:aws:logs:*:*:log-group:/myapp/onprem/*:log-stream:*"
        }
    ]
}

This policy grants access to the required CloudWatch Logs API calls on the on-premise logs only.
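Since the sample code above uses DefaultAWSCredentialsProviderChain, the access keys of this IAM user can be supplied on the on-premise host via environment variables, the shared credentials file, or similar means. As a minimal sketch (assuming a hypothetical named profile, onprem-logger, configured in ~/.aws/credentials on the host, and the ProfileCredentialsProvider class from the com.amazonaws.auth.profile package), you could also bind the client to that profile explicitly when building it in openCloudWatchLog():

		// Sketch: build the CloudWatch Logs client with the restricted IAM user's credentials,
		// loaded from a named profile ("onprem-logger" is a hypothetical profile name).
		Client = AWSLogsClientBuilder.standard()
				.withCredentials(new ProfileCredentialsProvider("onprem-logger"))
				.withRegion(Region)
				.build();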

Best Practices For Publishing Logs From On-Premise Components

The following are some key best practices to consider.

  • Logging can easily become network-intensive. Hence, be judicious about which logs are sent to CloudWatch Logs. For example, ERROR and WARNING logs are good candidates, but DEBUG logs typically are not (a minimal filtering sketch follows this list).
  • Avoid logging any sensitive data. This is critical, and often not thought through well. For example, avoid logging passwords in plain text, users’ Personally Identifiable Information (PII), and so on.
  • Always set an appropriate log retention period on the Log Group.
  • Use a well-defined naming convention for the Log Group and Log Streams. For example, /myapp/onprem/component-1.
  • Use a restrictive IAM policy for the user that is used to publish logs, and ensure it has access to the component-specific logs only.
  • Prefer application-specific IAM user(s) when logging across multiple applications, so that you can track and manage access per application.
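As a minimal sketch of the first practice above (the Level enum and the thresholds are illustrative; adapt them to whatever logging framework the application already uses), you could gate what gets shipped to CloudWatch while still writing everything locally, reusing the log() method shown earlier:

	/** Illustrative log levels. */
	enum Level { DEBUG, INFO, WARNING, ERROR }

	/** Writes every message locally, but publishes only WARNING and ERROR messages to CloudWatch. */
	public static void logFiltered(Level level, String msg) throws Exception {
		// Always write to the local/on-premise log (placeholder for the application's existing logger)
		System.out.println("[" + level + "] " + msg);

		// Only ship WARNING and ERROR messages over the network to CloudWatch Logs
		if (level == Level.WARNING || level == Level.ERROR) {
			log("[" + level + "] " + msg);
		}
	}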

Conclusion

When designing hybrid cloud or on-premise components, evaluate publishing their logs to CloudWatch. If you follow a few key best practices so that this is done in a manner that meets the application’s needs as well as enterprise-readiness considerations such as security and performance, it can be quite helpful for proactive application monitoring and management.

Resources

 

Happy logging!
– Nitin

If you liked this post, you may also find my AWS Advanced For Developers course helpful; it focuses on many such best practices and techniques for designing and deploying real-world applications on AWS.


 


