Categories
Logging Security

Some log management tips and a generic review of ELK Stack, GrayLog and Grafana

Centralized log management is very important for any tech company of any size. For larger companies, the entire company's logs need not be centralized in one place; they can be segmented by department, product, etc.

Background in the context of ALight Technology And Services Limited

ALight Technology And Services Limited is both a product and a service based company. It currently offers two completely free products – SimplePass and PodDB. With SimplePass, I am not worried much because, apart from the code, there is no data on the server and obviously no customer-specific data. With PodDB the risk is slightly higher because there is data, though still no customer-specific data. As of now the AWS account and servers are very highly secured, with immediate alerts on logins into the AWS console or servers, new EC2 instances, instance terminations, etc.

With the infrastructure and access to the infrastructure secured, the next step is external threats and being able to respond to them. These are very important steps prior to developing any products that would possibly contain customer data. What if someone tries to hack by sending a malicious payload, or launches a DOS (Denial of Service) or DDOS (Distributed Denial of Service) attack? For identifying, mitigating and preventing such things it's very important to have proper log management techniques, monitoring of metrics, proper alerts and a proper action plan / business continuity plan for when such incidents occur. Even if such a thing happened, it's very important to have logs so that computer forensics can be performed.

No company is going to offer free products forever without generating revenue; similarly, ALight Technology And Services Limited does have plans of developing revenue-generating products and offering services such as architecting, development, hosting, etc. Compared with the modern-day powerful hacking equipment of the anonymous group that calls itself the "eyes" (not to be confused with the intelligence "five eyes"; as a matter of fact the anonymous "eyes" are targeting the five countries that formed the "five eyes" and any whistleblowers like me in this context – I am the whistleblower (but not R&AW) of India's R&AW equipment capabilities and the atrocities that have been committed by the R&AW spies against me), the current state of information security standards falls far short.

I have looked into four solutions, and each of them has its own strengths and drawbacks.

What I was looking for:

For example – PodDB has web server logs (NGinx), ASP.Net Core web application logs, and several more logs from the microservice that interacts with the database, the microservice that writes some trending data, the microservice that queries Solr, etc. So my log sources are multiple and I want to aggregate all of these along with other logs such as syslog, the MariaDB audit log, etc.

  1. AWS Cloudwatch:

CloudWatch allows easy ingestion, very high availability, metrics, alarms, etc., and 5GB per month of log ingestion for free. However, live tailing of logs, i.e. being able to see logs as soon as they are ingested, is a bit problematic. Even querying / viewing across log groups is a bit problematic. The strength is the definable retention period for each log group. Once ingested, the logs cannot be modified, so it is definitely a great solution if storing logs for compliance reasons. AWS should consider introducing data storage tiers like S3's storage classes, i.e. lifecycle transitions – hot logs can be queried for a definable period, then a lifecycle transition kicks in and the logs are stored for archival purposes for some period and then deleted.
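As an aside, the retention period can be set per log group from the AWS CLI; for example (the log group name below is just a hypothetical example):

# set a 30-day retention period on a log group
aws logs put-retention-policy --log-group-name /poddb/nginx --retention-in-days 30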

2. ELK Stack:

The ELK stack consists of ElasticSearch, LogStash and Kibana: ElasticSearch for full-text search capabilities, LogStash for log ingestion, Kibana for visualization. This review is about the self-hosted version. The ELK stack has plenty of features and allows very easy management if the application and all of its components can be properly configured. There is built-in support for logs, live tailing of logs, metrics, etc. Management is easier using ElasticAgents, i.e. ElasticAgents can be installed on multiple machines and what data should be ingested by each agent can be controlled from the web interface. However, the ELK stack seemed a bit heavy in computing resource consumption, and for whatever reason LogStash crashed several times and the system crashed, i.e. the EC2 instance just hung and couldn't even be restarted. The ELK stack supports hot and cold log storage, i.e. the past 15 – 30 days of logs can be kept in hot storage and the older logs can be automatically moved into the cold tier – not queried frequently, but kept for various reasons.
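For reference, the hot/cold movement is configured through index lifecycle management (ILM). A minimal sketch of a policy – the policy name, ages and sizes below are assumptions for illustration, so adjust to your needs – looks roughly like this:

PUT _ilm/policy/logs-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": { "rollover": { "max_age": "30d", "max_size": "50gb" } }
      },
      "cold": {
        "min_age": "30d",
        "actions": { "set_priority": { "priority": 0 } }
      },
      "delete": {
        "min_age": "90d",
        "actions": { "delete": {} }
      }
    }
  }
}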

3. Graylog:

This is about the self-hosted version of Graylog. Graylog focuses only on log management. It is very easy to set up and ingest logs, and querying logs is easy. There is no support for metrics. Graylog allows creating snapshots of older data, which can be stored elsewhere, restored and searched on a necessity basis.

4. Grafana

This is about the free Grafana account. Grafana offers a very generous 50GB of log ingestion per month. Logs can be easily ingested into Loki and viewed from Grafana. Metrics can be ingested into Graphite and viewed. It is very easy to set up alerts. I have not tried it yet, but the free tier also includes 50GB of traces ingestion per month. One of the best features I liked about Grafana is the easy way of tagging logs. If log sources are properly tagged, combining and viewing multiple log sources is very easy.
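For example, if Promtail (one of the agents that ships logs into Loki) is used, tagging a log source is just a matter of adding labels in the scrape configuration. A rough sketch – the job name, label values and path below are hypothetical:

scrape_configs:
  - job_name: poddb
    static_configs:
      - targets: [localhost]
        labels:
          app: poddb        # tag identifying the product
          source: nginx     # tag identifying the log source
          __path__: /var/log/nginx/*.log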

Thank you Grafana for such a generous free tier and such a great product.

In the free tier there seems to be no control over the retention period; the Grafana paid subscription has control over retention and starts at $8 per month. I do have plans of signing up for a paid account just before launching commercial products, specifically for planning retention: either Grafana can store the older logs for a few extra months on my behalf, or they can provide a solution to upload into S3 Glacier – and, of course, being able to restore from S3 Glacier and search when needed, because storing old logs in S3 Glacier with no way of restoring and searching would defeat the entire purpose of storing old logs.

Mr. Kanti Kalyan Arumilli

Arumilli Kanti Kalyan, Founder & CEO

B.Tech, M.B.A

Facebook

LinkedIn

Threads

Instagram

Youtube

Founder & CEO, Lead Full-Stack .Net developer

ALight Technology And Services Limited

ALight Technologies USA Inc

Youtube

Facebook

LinkedIn

Phone / SMS / WhatsApp on the following 3 numbers:

+91-789-362-6688, +1-480-347-6849, +44-07718-273-964

kantikalyan@gmail.com, kantikalyan@outlook.com, admin@alightservices.com, kantikalyan.arumilli@alightservices.com, KArumilli2020@student.hult.edu, KantiKArumilli@outlook.com and 3 more rarely used email addresses – hardly once or twice a year.

Categories
Linux Security

Some important log management techniques on Linux – AuditD

In my continued pursuit of strengthening the security infrastructure at my own startup – ALight Technology And Services Limited – I have written a few blog articles in the past regarding securing web applications and the importance of audits and logs, as part of the NIST Cyber Security Framework. This blog post talks about some things I have done on the AWS infrastructure. While running a company with no other employees and while being the target of state-sponsored / state-trained hackers, I ended up learning a lot, and by now I have dabbled in pretty much everything in computing (expert at development, learning system administration, infosec, etc., as part of running my own startup).

  1. I created a base Ubuntu image by enabling ufw, installing auditd, installing the CloudWatch log agent, closing unnecessary ports, adding some custom alerters that fire as soon as an SSH login happens, etc. I call this AMI the golden AMI, and I update it every few months. The advantage of using a golden AMI like this is that any EC2 instance you launch already has these in place (see the sketch after this list).
  2. I am using the ELK stack along with CloudWatch logs and S3 for logs: the ELK stack for log analysis, i.e. logs are stored for a shorter period; CloudWatch logs for various other reasons (can't disclose); and finally S3 Glacier for longer-term retention.
  3. With the above mentioned setup, if an incident happens, all the necessary logs would be in place for analysis.
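A rough sketch of baking such a golden AMI from a prepared instance – the instance ID below is hypothetical, and the hardening commands are abbreviated:

# On the instance: harden and install the agents (abbreviated)
sudo ufw enable
sudo apt update && sudo apt install -y auditd
# ... install the CloudWatch log agent, custom SSH-login alerters, close unused ports ...

# From a machine with the AWS CLI configured: create the golden AMI from the prepared instance
aws ec2 create-image --instance-id i-0123456789abcdef0 --name "golden-ubuntu-base" --description "Hardened base image"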

I wanted to give a quick introduction to the CloudWatch log agent and AuditD as part of this blog post.

Cloudwatch log agent:

A small piece of software that ingests logs into AWS Cloudwatch. https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/QuickStartEC2Instance.html

The setup needs an IAM role with proper permissions; more details are at the above mentioned link.

On Ubuntu the log agent's config is stored at:

/var/awslogs/etc/awslogs.conf

The configuration file is very simple and straightforward.
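As an illustration, a single file stanza looks roughly like the following (the log group name, stream name and datetime format are examples; check them against your own setup):

[/var/log/syslog]
file = /var/log/syslog
log_group_name = /var/log/syslog
log_stream_name = {instance_id}
datetime_format = %b %d %H:%M:%S
initial_position = start_of_file
buffer_duration = 5000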

I would suggest ingesting all the Ubuntu system logs along with the auditd logs and creating a golden AMI.

AuditD:

This is a nice audit tool for Linux capable of auditing a lot of things.

Installation:

sudo apt update
sudo apt-get install auditd
sudo systemctl enable auditd
sudo systemctl start auditd

The configuration and rules are stored at /etc/audit. The config file is auditd.conf; rules go in audit.rules (or in files under /etc/audit/rules.d/).

The configuration file is self-explanatory.

There are no default rules.

But thankfully there is a GitHub repo with several rule templates for meeting various compliance standards such as PCI DSS. The PCI rules are at: https://github.com/linux-audit/audit-userspace/blob/master/rules/30-pci-dss-v31.rules

Several rule files are located in the same repository:

https://github.com/linux-audit/audit-userspace/tree/master/rules
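To give a flavour of the rule syntax, here are a couple of typical watch rules in the style used by those templates, plus a query against the resulting audit log (the key names are just examples):

# Watch changes to the user/group databases and tag them with a key
-w /etc/passwd -p wa -k identity
-w /etc/group -p wa -k identity

# Log executions of a privileged command
-w /usr/bin/sudo -p x -k privileged

# Later, search the audit log by key
sudo ausearch -k identity --interpret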

Stay safe & secure! Stay away from hackers / ransom thieves.

Happy computing!


Categories
.Net C# gRPC

Metrics gathering using CollectD, C# over GRPC

In the past, I have written about a little-known utility for collecting metrics known as CollectD. CollectD samples various metrics at a configured interval and outputs them to various destinations. This particular blog post is about having CollectD send the metrics to a gRPC endpoint; the endpoint can decide how to further process the received data. In this blog post, I will be writing about a C# gRPC server for receiving the data, but in reality most programming languages that support gRPC can be used.

One more thing: having CollectD use gRPC is slightly complex, because several different libraries need to be installed. Here is a list for Ubuntu – not an exhaustive list, but the libraries that I had to install on Ubuntu to allow CollectD to report metrics using gRPC – gcc, g++, build-essential, protobuf-compiler-grpc, libprotobuf-dev, protobuf-compiler, libgrpc++-dev. The best way to find any missing libraries is to compile CollectD from source as mentioned at https://collectd.org/download.shtml; after ./configure, keep installing the missing libraries until the output shows grpc – yes.
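Once CollectD is built with gRPC support, pointing it at the endpoint is done through the grpc plugin in collectd.conf. A minimal sketch – the host, port and SSL setting below are assumptions for illustration, so double-check the option names against the collectd.conf man page of your version:

LoadPlugin grpc

<Plugin grpc>
  # Send collected values to the gRPC endpoint
  <Server "metrics.example.com" "50051">
    EnableSSL false
  </Server>
</Plugin>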

Now for the C# server, here is the .proto file I have used:

syntax = "proto3";

package collectd;

import "google/protobuf/duration.proto";
import "google/protobuf/timestamp.proto";

service Collectd {
  rpc PutValues(stream PutValuesRequest) returns(PutValuesResponse);
}

message PutValuesRequest {
  ValueList value_list = 1;
}

message PutValuesResponse {}

message Identifier {
  string host = 1;
  string plugin = 2;
  string plugin_instance = 3;
  string type = 4;
  string type_instance = 5;
}

message Value {
  oneof value {
    uint64 counter = 1;
    double gauge = 2;
    int64 derive = 3;
    uint64 absolute = 4;
  };
}

message ValueList {
  repeated Value values = 1;

  google.protobuf.Timestamp time = 2;
  google.protobuf.Duration interval = 3;

  Identifier identifier = 4;

  repeated string ds_names = 5;
}

The .proto file is largely based on the proto files available at https://github.com/sandtable/collectd-grpc.

The implementation of the C# server is very simple. I have set the protobuf compiler to only generate the server-side code. Create a class that inherits from CollectdBase and override the method PutValues. Remember the request is a stream.

public override async Task<PutValuesResponse> PutValues(IAsyncStreamReader<PutValuesRequest> requestStream, ServerCallContext context)
{
    while (await requestStream.MoveNext())
    {
        var currentItem = requestStream.Current;
        //Do something with currentItem
    }

    return new PutValuesResponse();
}
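For completeness, here is a minimal sketch of hosting such a service in an ASP.NET Core gRPC project (the class name CollectdService is my assumption; the generated base class comes from the .proto above):

// Program.cs - minimal ASP.NET Core gRPC host (sketch, .NET 6+ minimal hosting model)
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddGrpc();

var app = builder.Build();
app.MapGrpcService<CollectdService>();   // CollectdService inherits from Collectd.CollectdBase
app.Run();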


Categories
.Net ASP.Net NLog

Some NLog configuration examples for writing into different log management systems

This blog post is going to discuss a few different useful configurations for writing logs using NLog.

  1. Writing directly into AWS CloudWatch

This can be accomplished by adding the NLog.AWS.Logger nuget package. Nuget link. Link for documentation / github.

Although NLog.AWS.Logger supports other logging frameworks such as Log4Net and Serilog, those are out of scope for the current blog post.

In nlog.config enable the extension:

<extensions>
    <add assembly="NLog.AWS.Logger" />
</extensions>

Define the logger as a target:

<targets>
    <target name="aws" type="AWSTarget" logGroup="NLog.ConfigExample" region="us-east-1"/>
</targets>
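For the target to actually receive log events, a matching rule is also needed in nlog.config; for example (the minimum level here is just an example):

<rules>
    <logger name="*" minlevel="Info" writeTo="aws" />
</rules>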

2. Writing logs into ElasticSearch

Use NLog.Targets.ElasticSearch nuget package. Nuget URL. Github / documentation link.

Enable the extension:

<extensions>
    <add assembly="NLog.Targets.ElasticSearch"/>
</extensions>

Define the target; the documentation says it is preferred to wrap it in a buffering wrapper:

<targets>
    <target name="elastic" xsi:type="BufferingWrapper" flushTimeout="5000">
        <target xsi:type="ElasticSearch" uri="http://localhost:9200/" />
    </target>
</targets>

*Instead of hardcoding an IP address or "localhost", I would suggest using a name such as "elasticsearch" or "kibana" and then using the HOSTS file to map it to the actual server. Then, even if you have several applications on the same server and the ElasticSearch server gets changed, you don't have to edit all the config files – you can edit just the hosts file. The hosts file is located at /etc/hosts on Linux and C:\Windows\System32\drivers\etc\hosts on Windows.
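For example, a hosts file entry mapping such a name to a hypothetical private IP address would look like this:

10.0.1.25    elasticsearch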

Now we will discuss four different interesting wrappers:

  1. Buffering Wrapper
  2. Async Wrapper
  3. AspNetBuffering Wrapper
  4. FallbackGroup Wrapper

These four targets are wrappers, i.e. they don't write logs directly. Instead, they wrap other targets and provide some interesting functionality that can be taken advantage of based on necessity and use-case.

  1. Buffering Wrapper

Buffers log events and sends them in batches, as shown above in the ElasticSearch example.

Documentation

There is a very interesting use-case of combining AutoFlushWrapper with BufferingWrapper and the actual target that writes the logs, such as writing the logs only when an error happens.
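A sketch of that combination, assuming logs should be buffered and only flushed to a file target when an error-level event arrives (target names, buffer size and file name are illustrative):

<targets>
    <target xsi:type="AutoFlushWrapper" name="errorTriggered" condition="level >= LogLevel.Error">
        <target xsi:type="BufferingWrapper" bufferSize="100">
            <target xsi:type="File" fileName="app.log" />
        </target>
    </target>
</targets>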

2. Async Wrapper

When you don't need buffering, but at the same time you don't want your application to wait until logging is done, this could be useful.

Documentation

3. AspNetBuffering Wrapper

Waits until the completion of the ASP.Net request and then sends the logs to the wrapped target.

Documentation

4. FallbackGroup Wrapper

This wrapper can be used to wrap multiple targets, for example ElasticSearch followed by CloudWatch followed by File, i.e. if the logger is unable to write to ElasticSearch it would write to CloudWatch, and if that too failed it would write the logs into a file.

Documentation
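A sketch of such a fallback chain (target parameters are trimmed for brevity; the URI, log group and file name are placeholders):

<targets>
    <target xsi:type="FallbackGroup" name="reliable" returnToFirstOnSuccess="true">
        <target xsi:type="ElasticSearch" uri="http://elasticsearch:9200/" />
        <target xsi:type="AWSTarget" logGroup="NLog.ConfigExample" region="us-east-1" />
        <target xsi:type="File" fileName="fallback.log" />
    </target>
</targets>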


Categories
Security

Some tips for securing public facing internal applications

People who have vast experience in I.T know that security is of utmost importance and needs to be implemented in layers. I had a need to secure my Kibana implementation and wanted to thwart hackers. I had two options:

  1. Use VPN
  2. Secure the website

Now, the problem is that very few VPNs, like Cisco AnyConnect, support biometric authentication, and ElasticSearch/Kibana's security options are very limited in the self-hosted version.

Thanks to the Apache web server for the rescue. Apache has a plugin known as mod_auth_openidc; this plugin can be used at the web server level, i.e. the web server takes care of authenticating and authorizing users. Kibana is hosted at https://kibana.alightservices.com.

I think this is a great feature and everyone should use it wherever possible for public-facing internal web applications, with authentication handled via OAuth2 or OpenID Connect.

Moreover, this plugin can easily enable SSO (Single Sign-On) features, and all of this with just some basic configuration.

Thank you Apache Foundation and ZmartZone.


Categories
ElasticSearch ELK Logging NGinx

How to install ElasticSearch, Kibana, Logstash, Filebeat, a Let's Encrypt SSL certificate and secure the login

This is an almost complete article for an ELK stack implementation. However, the authorization restrictions in Kibana are a bit tricky; this article shows authorization at the web server level for Apache (useful for smaller companies; for fine-grained permissions this might not be useful), i.e. this article serves the purpose of installing the above mentioned software stack. If I later come across anything different or useful about this installation, this article will be updated.

This is more like a step-by-step, end-to-end tutorial combining information from a lot of different sources. All the appropriate references are provided.

The actual log ingestion, monitoring, etc. might be separate articles.

This is for Ubuntu 20.04. I would suggest at least 4GB of RAM. Based upon your requirements, follow all or some of the steps.

Steps:

  1. Update
  2. Install Java
  3. Install ElasticSearch
  4. Minimal configuration of ElasticSearch
  5. Attach a separate data volume to the EC2 instance in AWS (Optional)
  6. Start and verify ElasticSearch
  7. Installing Kibana
  8. Installing NGinx (Optional if NGinx is already installed)
  9. Installing Apache and securing Apache (Optional if you have a different web server secured in a different way)
  9a. Securing using Auth0 (My preferred way due to some undisclosed reasons)
  10. Install LetsEncrypt's free SSL certificate for NGinx (Must, unless you have a different form of SSL certificate)
  11. Install LetsEncrypt's free SSL certificate for Apache (Must, unless you have a different form of SSL certificate)
  12. Install Dex (Optional, configuring Dex is not covered in this article)
  13. Configure Apache as a reverse proxy
  14. Configure NGinx as a reverse proxy

  1. Update:
sudo apt update

sudo apt upgrade

2. Install Java

sudo apt install default-jre

3. Install ElasticSearch

curl -fsSL https://artifacts.elastic.co/GPG-KEY-elasticsearch |sudo gpg --dearmor -o /usr/share/keyrings/elastic.gpg

echo "deb [signed-by=/usr/share/keyrings/elastic.gpg] https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list

sudo apt update

sudo apt install elasticsearch

4. Minimal configuration of ElasticSearch

ElasticSearch stores its configuration in a file located at /etc/elasticsearch/elasticsearch.yml; for now we will uncomment network.host and set it to localhost.

sudo nano /etc/elasticsearch/elasticsearch.yml

Uncomment the network.host line and set it to localhost, then press Ctrl + X, Y, Enter to save the file:

network.host: localhost

5. Attach a separate data volume to the EC2 instance in AWS (Optional)

Goto AWS Console, EC2 and click Volumes.

AWS Console -> EC2 -> Volumes

Then click Create Volume in the top right.


Select the appropriate volume type, size etc… and create volume


Once the volume is created and available, select the volume and click “Attach Volume” from the “Actions” menu.


Select the instance for which the volume needs to be attached and click attach.


Now SSH into the EC2 instance

lsblk

The output should list the newly attached volume; in this example, nvme1n1 was attached.

Format the newly attached volume

sudo mkfs -t xfs /dev/nvme1n1

Mount it to /var/lib/elasticsearch:

sudo mount /dev/nvme1n1 /var/lib/elasticsearch/

For the volume to be automatically mounted, edit /etc/fstab. But first make a backup copy, because an improper fstab configuration can cause boot problems.

sudo blkid
sudo nano /etc/fstab

Paste the following line, replacing XXX with your own UUID from the blkid output in the previous step.

UUID=XXX  /var/lib/elasticsearch  xfs  defaults,nofail  0  2

6. Start and verify ElasticSearch

sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch
curl -X GET "localhost:9200"
If the above three commands ran without errors and the curl command returns a JSON document describing the cluster (name, version, etc.), the ElasticSearch installation is complete.

7. Installing Kibana

sudo apt install kibana
sudo systemctl enable kibana
sudo systemctl start kibana

8. Installing NGinx (Optional if NGinx is installed)

sudo apt install nginx
sudo systemctl enable nginx
sudo systemctl start nginx

Enable port 80 in the Security Group and in the firewall (ufw) if you have one enabled, then navigate to the public IP address of your server and check whether the NGinx welcome page is displayed.
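For example, with ufw the HTTP and HTTPS ports can be opened like this:

sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw status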

9. Installing Apache and securing Apache (Optional if you have a different web server and secured in a different way)

sudo apt install apache2

sudo apt-get install libapache2-mod-auth-openidc

sudo a2enmod auth_openidc

The next steps are optional; they are for securing the website at the web server level. As a one-person company, for now, I need to secure websites directly at the server level. If fine-grained access rights are needed, those need to be handled at the application level.

cd /etc/apache2/sites-available

sudo cp 000-default.conf kibana.conf

sudo nano kibana.conf

Uncomment the ServerName line and use your own domain.

sudo a2ensite kibana.conf        # enable the new conf

sudo a2dissite 000-default.conf  # disable the old conf

sudo apache2ctl configtest       # validate syntax

sudo systemctl restart apache2   # restart Apache

Install the SSL certificate as mentioned in step 11 and then proceed.

Install the Apache OpenID Connect module and secure the site

sudo apt install libapache2-mod-auth-openidc

Create a new app in the Google Cloud Console by following these instructions: https://support.google.com/cloud/answer/6158849?hl=en

Modify the appropriate Apache .conf file for your chosen provider. Here is a sample for Google login.

<VirtualHost>
...

    OIDCClaimPrefix "OIDC-"
    OIDCResponseType "code"
    OIDCScope "openid email profile"
    OIDCProviderMetadataURL https://accounts.google.com/.well-known/openid-configuration
    OIDCClientID <YourClientID>
    OIDCClientSecret <YourClientSecret>
    OIDCCryptoPassphrase <StrongCryptoPhrase>
    OIDCRedirectURI https://kibana.alightservices.com/v1/openid/callback
    # The above redirect URI can be any vanity URL

    <LocationMatch "/">
      AuthType openid-connect
      Require valid-user
      # Require claim <name>:<value>   (optional: restrict access based on a specific claim)
      LogLevel debug
    </LocationMatch>

...
</VirtualHost>

9a) Securing using Auth0 (My preferred way due to some undisclosed reasons)

    OIDCClaimPrefix "OIDC-"
    OIDCResponseType "code"
    OIDCScope "openid email profile"
    OIDCProviderMetadataURL https://alightservices.eu.auth0.com/.well-known/openid-configuration
    OIDCClientID <YourClientID>
    OIDCClientSecret <YourClientSecret>

10. Install LetsEncrypt’s free SSL certificate for NGinx (Must, unless you have different form of SSL certificates)

sudo apt install certbot python3-certbot-nginx

Edit the nginx config file; here I am editing the default file:

sudo nano /etc/nginx/sites-available/default

Add the following in the server block

server_name kibana.alightservices.com;

Verify and restart nginx

sudo nginx -t
sudo systemctl restart nginx

Generate certificates by issuing the following command and following the instructions:

sudo certbot --nginx

11. Install LetsEncrypt’s free SSL certificate for Apache (Must, unless you have different form of SSL certificates)

sudo apt install certbot python3-certbot-apache

sudo certbot --apache

12. Install Dex (Optional, configuring Dex is not covered in this article)

Dex needs Go, gcc and build-essential.

sudo apt install make gcc build-essential

curl -LO https://go.dev/dl/go1.19.4.linux-amd64.tar.gz

sudo rm -rf /usr/local/go && sudo tar -C /usr/local -xzf go1.19.4.linux-amd64.tar.gz

export PATH=$PATH:/usr/local/go/bin

git clone https://github.com/dexidp/dex.git

cd dex/

make build

13. Configure Apache as a reverse proxy

Enable the following modules:

sudo a2enmod proxy
sudo a2enmod proxy_http
sudo a2enmod proxy_balancer
sudo a2enmod lbmethod_byrequests

In the appropriate .conf file remove “DocumentRoot” and add these lines:

ProxyPass / http://127.0.0.1:5601/
ProxyPassReverse / http://127.0.0.1:5601/

Then validate the config file and restart Apache:

apachectl configtest

sudo systemctl restart apache2

14. Configure NGinx as a reverse proxy

Inside your nginx config file inside “server” block, configure “location” block to look like this:

location / {
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }

Restart nginx

sudo systemctl restart nginx

That's all. Voila, ElasticSearch and Kibana are up and running! Ingesting logs, configuration, etc. are topics for another blog post.

References

Apache OpenID Connect example. (n.d.). Retrieved January 2, 2023, from https://docs.openathens.net/providers/apache-openid-connect-example

Boucheron, B. (2021, March 1). How To Secure Nginx with Let's Encrypt on Ubuntu 20.04. DigitalOcean Community. Retrieved January 2, 2023, from https://www.digitalocean.com/community/tutorials/how-to-secure-nginx-with-let-s-encrypt-on-ubuntu-20-04

Glass, E., & Camisso, J. (2022, April 26). How To Install Elasticsearch, Logstash, and Kibana (Elastic Stack) on Ubuntu 22.04. DigitalOcean Community. Retrieved January 2, 2023, from https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elastic-stack-on-ubuntu-22-04

Heidi, E. (2020, April 29). How To Secure Apache with Let's Encrypt on Ubuntu 20.04. DigitalOcean Community. Retrieved January 2, 2023, from https://www.digitalocean.com/community/tutorials/how-to-secure-apache-with-let-s-encrypt-on-ubuntu-20-04

Krantz, X. (2021, December 14). How to setup SSO for Elastic/Kibana with GitHub auth provider. Medium. https://medium.com/devobs/how-to-setup-sso-for-elastic-kibana-with-github-auth-provider-7268128977f9

Make an Amazon EBS volume available for use on Linux – Amazon Elastic Compute Cloud. (n.d.). AWS Documentation. Retrieved January 1, 2023, from https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-using-volumes.html

OpenStack Docs: Setup OpenID Connect. (n.d.). Retrieved January 2, 2023, from https://docs.openstack.org/keystone/pike/advanced-topics/federation/openidc.html

ZmartZone IAM. (n.d.-a). GitHub – zmartzone/mod_auth_openidc: OpenID Certified™ OpenID Connect Relying Party implementation for Apache HTTP Server 2.x. GitHub. Retrieved January 2, 2023, from https://github.com/zmartzone/mod_auth_openidc

ZmartZone IAM. (n.d.-b). Home · zmartzone/mod_auth_openidc Wiki. GitHub. Retrieved January 2, 2023, from https://github.com/zmartzone/mod_auth_openidc/wiki

Categories
.Net C#

Live C# development session – 1

In this video, I explained the purpose and set up the basic application architecture. More videos will be done soon.


Categories
Security

ELK stack for centralized logging and monitoring

I have mentioned centralized logging and monitoring in previous blog articles. I have experimented with various metrics collection tools and log tools. Currently all the logs are being ingested, i.e. collected, but there is no proper analysis yet.

I have read about the ELK stack, and based on those articles and the availability of plugins, the ELK stack seems like the perfect choice.

Over the next few weeks I will be implementing the ELK stack and will definitely share some knowledge.

In the past I have mentioned the NIST Cyber Security Framework. As part of implementing the NIST Cyber Security Framework and improving security at ALight Technology And Services Limited, additional logging, monitoring and alerting systems are being implemented, i.e. ALight Technology And Services Limited's stance on cyber security is that hardened security is the topmost priority before any kind of consumer / customer data is stored. This helps ALight Technology And Services Limited's long-term vision of providing several B2B and B2C free, paid and freemium products.


Categories
Security

The need for serious I.T security, and the current state of a sophisticated spy's / hacker's equipment

I wanted to do a live coding session for a little security utility / tool, but ended up showing several things: the need for such a tool, and a discussion of the sophisticated spies' / hackers' equipment. I will definitely do some live coding and open source the tool.

Categories
Security

An approach for securing some sensitive content

In the past I have mentioned having a proper MFA-enabled VPN in some of my Youtube videos on ALight Technology And Services Limited's official Youtube channel (https://www.youtube.com/@alighttechnologyandservicesltd). I have come across a free VPN known as Pritunl; Pritunl has SSO support and YubiKey support as per the documentation located here. However, there is a glaring security issue in the setup process: the passwords and keys are generated and shown in plain text. This is a very big problem. So, I thought I would create a set of two tools that would do the following:

Tool-1 (on the server):

  1. Accepts a Key and IV, i.e. prompts for the Key and IV, but does not echo the entered values to the screen (more like prompting for a password).
  2. Prompts for a command to run.
  3. Executes the command and captures the standard output and standard error.
  4. If there is an error – displays it on the screen.
  5. If there is no error, encrypts the standard output and displays it on the screen.

Tool-2 (on the clientside – on the laptop)

  1. Generate an IV and Key for symmetric encryption.
  2. Copy the Key to the clipboard when required (a button click for a Windows application, or some kind of console key press).
  3. Copy the IV to the clipboard when required.
  4. Accept a block of encrypted text, decrypt it and copy the plaintext into the clipboard.
  5. Whenever anything is copied into the clipboard, automatically clear the clipboard after a configurable time such as 10 or 20 seconds.

With these two tools, I can generate a new Key / IV pair, launch the server tool and input the Key and IV. Then I can run some command, and the keys or passwords generated by the command get encrypted and displayed. I can copy the outputted value on the server into the desktop app, then decrypt and use it wherever I want.

These are tools that are not needed every day, but they are definitely necessary, especially if one is being targeted by hackers, spies and ransom-asking goons (aka takers / all). I am considering open sourcing the code for these two tools. The code can also serve as an introduction to symmetric encryption in C#, and it would also show some usage of the System.Diagnostics.Process class. I might even do a live coding session; it shouldn't take longer than 15 – 20 minutes, and if I do one, I would explain the concepts.
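As a taste of what the encryption part would look like, here is a minimal C# sketch of symmetric (AES) encryption and decryption with a caller-supplied Key and IV; this is illustrative only, not the actual tool:

// Minimal sketch: AES encryption/decryption with a caller-supplied Key and IV (not the actual tool)
using System;
using System.Security.Cryptography;
using System.Text;

public static class SymmetricCrypto
{
    public static string Encrypt(string plainText, byte[] key, byte[] iv)
    {
        using var aes = Aes.Create();
        aes.Key = key;
        aes.IV = iv;
        using var encryptor = aes.CreateEncryptor();
        var plainBytes = Encoding.UTF8.GetBytes(plainText);
        var cipherBytes = encryptor.TransformFinalBlock(plainBytes, 0, plainBytes.Length);
        return Convert.ToBase64String(cipherBytes);   // safe to display / copy as text
    }

    public static string Decrypt(string cipherText, byte[] key, byte[] iv)
    {
        using var aes = Aes.Create();
        aes.Key = key;
        aes.IV = iv;
        using var decryptor = aes.CreateDecryptor();
        var cipherBytes = Convert.FromBase64String(cipherText);
        var plainBytes = decryptor.TransformFinalBlock(cipherBytes, 0, cipherBytes.Length);
        return Encoding.UTF8.GetString(plainBytes);
    }
}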
