Recently, I have been writing about log management tools and techniques, and I am currently evaluating Grafana Loki on-premise. I will write a review of Grafana Loki in a few days. As of now, from a server hardware requirements and log-volume ingestion standpoint, Grafana Loki seems excellent compared with the ELK stack and GrayLog.
This is a general blog post. For proper log management, we need several different components:
Log ingestion client
Log ingestion server
Log Viewer
Some kind of long-term archiver that can restore certain logs on an as-needed basis (Optional)
Log Ingestion Client:
FluentD is the best log ingestion client for several reasons. Every log ingestion stack has its own log ingestion client: the ELK stack has the Beats family (Filebeat, Metricbeat, etc.), GrayLog does not have a client of its own but supports log ingestion via GELF, rsyslog, etc., and Grafana Loki has Promtail.
FluentD can collect logs from various sources and ingest them into various destinations. Here is the best part: multiple destinations based on rules. For example, certain logs can be ingested into log servers and also uploaded to S3. It is very easy to configure and customize, and there are plenty of plugins for sources, destinations, and even for customizing logs, such as adding tags and extracting values. Here is a list of plugins.
FluentD can ingest into Grafana Loki, the ELK stack, GrayLog, and much more. If you use FluentD and the target needs to be changed, it's just a matter of configuration.
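To make the multi-destination idea concrete, here is a minimal sketch of a FluentD configuration that tails an NGinx access log and copies each event to both Grafana Loki and S3. The tag, file paths, bucket, and URL are placeholders, and the loki and s3 outputs assume the fluent-plugin-grafana-loki and fluent-plugin-s3 plugins are installed:

<source>
  @type tail
  path /var/log/nginx/access.log
  pos_file /var/log/td-agent/nginx-access.pos
  tag nginx.access
  <parse>
    @type json
  </parse>
</source>

<match nginx.access>
  @type copy
  <store>
    @type loki
    url http://localhost:3100
  </store>
  <store>
    @type s3
    s3_bucket my-log-archive
    s3_region us-east-1
    path logs/nginx/
  </store>
</match>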
Log Ingestion Server:
ELK vs GrayLog vs Grafana Loki vs Seq and several others. As of now, I have evaluated ELK, GrayLog and Grafana Loki.
Log Viewer:
The Grafana front-end with a Loki backend; GrayLog's own web interface; and the Kibana front-end with an ElasticSearch backend in the ELK stack.
Long-Term Archiving:
The ELK stack has index lifecycle rules for backing up and restoring. GrayLog can be configured to close indexes and re-open them when necessary. Grafana Loki has retention and compactor settings; however, I have not yet figured out how to re-open compacted gz files on an as-needed basis.
Apart from these, I am using Graphite for metrics and have plans for ingesting additional metrics. For now I am using the excellent hosted solution provided by Grafana, and in the near term I don't have plans for self-hosting metrics. The Grafana front-end supports several data sources.
I am thinking of collecting certain extra metrics without overloading the application (this might be an afterthought, or might not be). I am collecting NGinx logs in JSON format; the URL, upstream connect time, and upstream response time are being logged. By parsing these logs, the name of the ASP.Net MVC controller, the name of the action method, and the HTTP verb can be captured and used as metrics. I can also very easily add metrics at the database layer in the application. With these metrics, I can identify bottlenecks and slow-performing methods, monitor average response times, and set alerts.
The next few days or weeks will be about custom metric collection based on logs. You can expect a few blog posts on FluentD configuration, C# code, etc. FluentD does have some plugins for collecting certain metrics, but we will look into some C# code for parsing logs and sending metrics into Graphite.
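As a preview, here is a minimal C# sketch, not the final code from the upcoming posts, that parses one NGinx JSON log line, derives the controller and action names assuming the default {controller}/{action}/{id} MVC route, and pushes the upstream response time to Graphite's plaintext protocol on port 2003. The JSON field names and the Graphite host name are assumptions:

using System;
using System.Net.Sockets;
using System.Text;
using System.Text.Json;

class GraphiteSample
{
    static void Main()
    {
        // Example NGinx JSON access-log line; real code would read from a file or FluentD.
        var line = "{\"request_uri\":\"/Orders/Details/5\",\"upstream_response_time\":\"0.042\"}";
        using var doc = JsonDocument.Parse(line);
        var root = doc.RootElement;

        // Derive controller and action assuming the default {controller}/{action}/{id} route.
        var parts = root.GetProperty("request_uri").GetString().Trim('/').Split('/');
        var controller = parts.Length > 0 && parts[0].Length > 0 ? parts[0] : "Home";
        var action = parts.Length > 1 ? parts[1] : "Index";

        var responseTime = double.Parse(root.GetProperty("upstream_response_time").GetString());
        var timestamp = DateTimeOffset.UtcNow.ToUnixTimeSeconds();

        // Graphite plaintext protocol: "<metric.path> <value> <unix-timestamp>\n"
        var payload = $"app.mvc.{controller}.{action}.upstream_response_time {responseTime} {timestamp}\n";
        using var client = new TcpClient("graphite.example.com", 2003); // hypothetical host
        using var stream = client.GetStream();
        var bytes = Encoding.ASCII.GetBytes(payload);
        stream.Write(bytes, 0, bytes.Length);
    }
}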
Here is a screenshot from the self-hosted Grafana front-end for Loki logs:
Here is a screenshot from the Grafana.com hosted instance showing Graphite metrics:
I am hoping this blog post helps someone. Expect some C# code for working with logs, metrics, and Graphite over the next few days and weeks.
This is an almost complete article for an ELK stack implementation. However, the authorization restrictions in Kibana are a bit tricky; this article shows authorization at the web server level for Apache, which is useful for smaller companies but not for fine-grained permissions. In other words, this article serves the purpose of installing the software stack mentioned above. If I later come across anything different or useful about the installation, this article will be updated.
This is more of a step-by-step, end-to-end tutorial, combining information from a lot of different sources. All the appropriate references are provided.
The actual log ingestion, monitoring, etc., might be covered in separate articles.
This is for Ubuntu 20.04. I would suggest at least 4 GB of RAM. Based upon your requirements, follow all or some of the steps.
Steps:
1. Update
2. Install Java
3. Install ElasticSearch
4. Minimal configuration of ElasticSearch
5. Attach a separate data volume to EC2 instance in AWS (Optional)
6. Start and verify ElasticSearch
7. Installing Kibana
8. Installing NGinx (Optional if NGinx is already installed)
9. Installing Apache and securing Apache (Optional if you have a different web server secured in a different way)
9a) Securing using Auth0 (My preferred way due to some undisclosed reasons)
10. Install Let's Encrypt's free SSL certificate for NGinx (Must, unless you have a different form of SSL certificate)
11. Install Let's Encrypt's free SSL certificate for Apache (Must, unless you have a different form of SSL certificate)
12. Install Dex (Optional, configuring Dex is not covered in this article)
ElasticSearch stores its configuration in /etc/elasticsearch/elasticsearch.yml; for now, we will uncomment network.host and set it to localhost.
sudo nano /etc/elasticsearch/elasticsearch.yml
# Uncomment network.host as shown below, then press Ctrl+X, Y, Enter to save the file.
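After editing, the network.host line should look like this (binding ElasticSearch to localhost only):

network.host: localhost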
5. Attach a separate data volume to EC2 instance in AWS (Optional)
Go to the AWS Console, open EC2, and click Volumes.
Then click Create Volume in the top right.
Select the appropriate volume type, size, etc., and create the volume.
Once the volume is created and available, select it and click "Attach Volume" from the "Actions" menu.
Select the instance to which the volume needs to be attached and click Attach.
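If you prefer the AWS CLI over the console, the equivalent calls look roughly like this (the availability zone, size, volume ID, instance ID, and device name below are placeholders):

aws ec2 create-volume --availability-zone us-east-1a --size 20 --volume-type gp3
aws ec2 attach-volume --volume-id vol-0abc12345 --instance-id i-0abc12345 --device /dev/sdf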
Now SSH into the EC2 instance and run:
lsblk
This should show something like the following (a representative example; your device names and sizes will differ):
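NAME         MAJ:MIN RM  SIZE RO TYPE MOUNTPOINT
nvme0n1      259:0    0    8G  0 disk
└─nvme0n1p1  259:1    0    8G  0 part /
nvme1n1      259:2    0   20G  0 disk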
Here, nvme1n1 is the newly attached volume.
Format the newly attached volume
sudo mkfs -t xfs /dev/nvme1n1
Mount it to /var/lib/elasticsearch:
sudo mount /dev/nvme1n1 /var/lib/elasticsearch/
For the volume to be mounted automatically at boot, edit /etc/fstab. But first, make a backup copy, because an improper fstab configuration can cause boot problems.
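The backup can be made like this (the destination filename is just a convention):

sudo cp /etc/fstab /etc/fstab.bak

Then use blkid to find the UUID of the newly attached volume: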
sudo blkid
sudo nano /etc/fstab
Paste the following line, replacing XXX with your own UUID from the previous step.
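A typical entry looks like this (a sketch assuming the xfs filesystem created above; nofail keeps an unavailable volume from blocking boot):

UUID=XXX  /var/lib/elasticsearch  xfs  defaults,nofail  0  2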
Enable port 80 in the Security Group and in the firewall (ufw) if you have one, then navigate to the public IP address of your instance and check whether the default NGinx page is displayed.
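With ufw, allowing HTTP might look like this (assuming ufw is enabled):

sudo ufw allow 80/tcp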
9. Installing Apache and securing Apache (Optional if you have a different web server secured in a different way)
The next steps are optional; they are for securing the website at the server level. As a one-person company, for now, I need to secure websites directly at the server level. If access rights are an issue, those need to be handled at the application level.
ZmartZone IAM. (n.d.). zmartzone/mod_auth_openidc: OpenID Certified™ OpenID Connect Relying Party implementation for Apache HTTP Server 2.x. GitHub. Retrieved January 2, 2023, from https://github.com/zmartzone/mod_auth_openidc