
AI & Data Analytics | Application Log Data in Context

Amena Siddiqi October 25, 2018

Application log messages are a crucial source of diagnostic information for application teams, but Splunk and other log analytics solutions are an expensive and inefficient way to capture and process this data. Integrating log analysis fully into application performance monitoring (APM) means you can search across log data, see which methods generated each message in the context of the transaction call stack and, of course, identify every business transaction and user that was impacted.

Application context is key for log analytics

When log data is gathered through log scraping or other methods by dedicated log management tools, it is collected without the context of the code that generated it or the business transaction it relates to. To troubleshoot errors or exception handling with this log data, performance investigators typically need three or four different tools, and may even have to open log files manually. They have to visually scan and time-correlate a log message against an APM tool that’s monitoring HTTP requests just to see which transaction it relates to.

Unlike traditional log analytics tools, which collect log messages after the application has written them to disk with standard app libraries, Aternity APM intercepts the log message in memory, before it is written to a file. With Aternity APM, there is no need to specially configure Splunk or another log collector to capture these messages.
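Aternity APM does this via agent instrumentation; as a simplified illustration of the general idea (not the product's actual mechanism), here is a sketch using Python's standard logging module, where a custom handler captures each record as an in-memory object before any file handler would write it:

```python
import logging

class InMemoryInterceptor(logging.Handler):
    """Capture log records in memory before (or instead of) any disk write."""
    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        # The record is intercepted here while still an in-memory object,
        # so it can be enriched or forwarded without any file I/O.
        self.records.append(self.format(record))

logger = logging.getLogger("checkout")
interceptor = InMemoryInterceptor()
interceptor.setFormatter(logging.Formatter("%(levelname)s %(name)s %(message)s"))
logger.addHandler(interceptor)

logger.warning("payment gateway timeout")
print(interceptor.records[0])  # WARNING checkout payment gateway timeout
```

Because the interceptor sees the same record object the file handler would, nothing needs to change in the application's logging configuration.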


In addition, Aternity APM presents log data in the context of the end user transaction, so you can clearly see who was affected, when, and what they were trying to do. Because it captures all the parameters in a web request, if the problem is, for example, with a shopping cart “checkout” transaction, Aternity APM will not only identify the “checkout” transaction, but will also capture the contents of the shopping cart and provide a detailed breakdown of performance for every “checkout” request, whether it succeeded or not.
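The general pattern of stamping each log message with its business transaction can be sketched in a few lines. This is an illustrative sketch, not Aternity's implementation; the transaction context variable and names are hypothetical:

```python
import contextvars
import logging

# Hypothetical per-request context an APM agent might maintain.
current_txn = contextvars.ContextVar("current_txn", default="unknown")

class TransactionFilter(logging.Filter):
    """Stamp each log record with the business transaction it belongs to."""
    def filter(self, record):
        record.txn = current_txn.get()
        return True

captured = []

class ListHandler(logging.Handler):
    def emit(self, record):
        captured.append(self.format(record))

logger = logging.getLogger("app")
handler = ListHandler()
handler.setFormatter(logging.Formatter("[txn=%(txn)s] %(message)s"))
handler.addFilter(TransactionFilter())
logger.addHandler(handler)

# During a "checkout" request, every log line carries its transaction.
current_txn.set("checkout")
logger.error("inventory lookup failed")
print(captured[0])  # [txn=checkout] inventory lookup failed
```

With the transaction name attached at write time, there is no need to time-correlate log lines against HTTP requests after the fact.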

“99% of our log data is rubbish”

Here’s a true story from one of our on-site engineers working with a new client. Prior to implementing Aternity APM, the developers and architects at the client site were using ELK to look at web logs, application logs and database logs. Across the application estate, each application instance was generating over 30GB of logs per day, and each database and web node over 200GB per day. The environment was shared with other application teams, and there were literally terabytes of logs flying across the network from all angles. This resulted in low data retention, making it impossible to go back in time to perform forensic analysis; they were lucky to get a day’s worth at best! They told us, “99% of our logs are rubbish and have nothing to do with what we are looking for or need. It’s typical that the one percent of useful data we are looking for has just rolled out.”

With Aternity APM, they were able to intercept the same log messages that Kibana was ingesting, with the additional benefit that each message was attached to the actual end user transaction itself. And the best part? The AppInternals data, being transaction-driven, focused only on the useful “one percent.”

Live, log and prosper

Application logs generate a significant volume of data. Based on what we’ve seen at our customer sites, they can easily represent well over half of all log data collected. Since typical log analytics tools license based on the volume of log data processed, this can become very expensive very quickly.

With Aternity APM, application log analysis and storage is part of the core APM functionality that comes at a fixed license cost, independent of the amount of data processed. Aternity APM collects all transactions along with their user metadata and application log messages, and indexes them in its big data store. Taking a customer example as a benchmark, we were able to store about 40 days’ worth of raw transactional data on a 4TB disk as a result of our efficient storage data structure. This presents a much more cost-effective solution for application performance analysis.
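Some quick back-of-envelope math, using only the figures quoted in this post, shows why retention differs so sharply between the two approaches:

```python
# Figures from this post: a 4TB disk held ~40 days of transactional data.
disk_gb = 4 * 1000
retention_days = 40
sustained_gb_per_day = disk_gb / retention_days
print(sustained_gb_per_day)  # 100.0 GB/day sustained

# Compare: raw log shipping at 200 GB/day per node (the client's database
# and web nodes) would fill the same 4TB disk in 20 days for a single
# node, before any indexing overhead.
raw_retention_days = disk_gb / 200
print(raw_retention_days)  # 20.0 days
```

The gap widens further once multiple nodes share the same storage, which is consistent with the client above seeing barely a day of retention.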

If you’re using ELK or Splunk for log analysis related to APM, ask yourself:

  • How much disk are you using?
  • How far back in time is your data retention?
  • What is the percentage of log lines that actually matter?
  • How much network overhead is this causing?

Conclusion

Aternity APM captures all the important details of every single user-driven transaction and a whole lot more. And the icing on the cake is, we capture the exact same log message that you would expect from a log analysis tool, except our log entries are rich in value (not noise) and are attached to the transaction itself.

If you missed the other blogs in this series on AI and Data Analytics, you can find them below.


To learn more about the Aternity APM product, read our whitepaper on Why Big Data is Critical for APM or try it out for yourself in our sandbox.

You may also like

  • AI & Data Analytics | Finding the Needle in the Haystack
  • AI & Data Analytics | Extracting Business and App Intelligence
  • AI and Data Analytics | What’s More Valuable Than Big Data?