Is there a modern (e.g. NoSQL) web analytics solution based on log files?
I have been using AWStats for many years to process my log files, but I am missing many features (like cross-domain reports) and I hate being stuck with the extra fields I created years ago. Either way, I am not going to continue using this script.
Is there a modern Apache log analytics solution based on modern storage technologies like NoSQL, or at least one that can cope with large datasets efficiently?
I am primarily looking for something that generates nice sortable and searchable output focused on web analytics, without my having to write my own frontend (so Graylog2 is not an option).
This question is purely about log file based solutions.
In case someone else comes across this problem:
My solution for the moment was to use GoAccess on the log files, or on subsets extracted with cat and grep. It may seem like a regression compared to AWStats or any software with persistent result storage, but its performance is great for ad-hoc log file analysis.
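For reference, a minimal sketch of that workflow; the log path, format, and grep pattern are illustrative, not from the original answer:

    # Interactive terminal report over the whole log
    goaccess /var/log/apache2/access.log --log-format=COMBINED

    # Ad-hoc subset: filter with grep first, then pipe into GoAccess
    # (-f - reads the log from stdin) and write a standalone HTML report
    grep ' /blog/' /var/log/apache2/access.log \
      | goaccess -f - --log-format=COMBINED -o report.html

COMBINED is GoAccess's predefined name for the standard Apache combined log format; if your LogFormat differs, you have to describe it via --log-format instead.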
Splunk can import Apache logs: docs.splunk.com/Documentation/Splunk/4.3.2/Data/Apachelogslocal
Or review this answer on Stack Overflow
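As a rough sketch, a simple file monitor input is enough to get the access log into Splunk; the path and index name below are assumptions for illustration:

    # inputs.conf: watch the Apache access log
    [monitor:///var/log/apache2/access.log]
    # access_combined is one of Splunk's pretrained sourcetypes
    # for combined-format access logs
    sourcetype = access_combined
    index = main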
A relatively recent option is to use Flume to collect the logs and the MongoDB sink plugin for Flume to write the events to MongoDB.
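A hedged sketch of what such a pipeline could look like as a Flume NG agent configuration. The sink type and its host/port/db/collection property names are placeholders: they depend on whichever MongoDB sink plugin you install, so check that plugin's documentation for the real class name and settings.

    # flume.conf: tail an Apache log into a MongoDB sink
    agent.sources = apache
    agent.channels = mem
    agent.sinks = mongo

    # Exec source that follows the access log
    # (for production, a spooling-directory source is more robust)
    agent.sources.apache.type = exec
    agent.sources.apache.command = tail -F /var/log/apache2/access.log
    agent.sources.apache.channels = mem

    # In-memory channel buffering events between source and sink
    agent.channels.mem.type = memory
    agent.channels.mem.capacity = 10000

    # Hypothetical sink type: substitute the fully qualified class name
    # and properties of the MongoDB sink plugin you actually installed
    agent.sinks.mongo.type = org.example.flume.MongoDBSink
    agent.sinks.mongo.host = localhost
    agent.sinks.mongo.port = 27017
    agent.sinks.mongo.db = weblogs
    agent.sinks.mongo.collection = access
    agent.sinks.mongo.channel = mem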