NWebSec is a library that I am familiar with and have used in some web applications over the past 3–4 years.
Modern web browsers support several HTTP headers for security-related purposes: for example, telling the browser not to cache content, or to always require HTTPS. Most, if not all, of these headers can be set at the web server level instead of at the application level, and there are various guides and blog posts for doing so on the different HTTP servers.
This blog post is about using NWebSec to set these headers in ASP.Net web applications. I won't go over the entire list, but I will mention a few.
Configure X-Frame-Options either at the application level or at the web server level, unless you need your websites to be displayed in iframes on other websites.
Enforce HTTPS by using Content-Security-Policy, Strict-Transport-Security, and upgrade-insecure-requests.
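With the NWebSec middleware packages, these headers can also be set in code. Here is a minimal sketch using methods from NWebSec.AspNetCore.Middleware; treat the specific policy values as illustrative rather than recommended:

app.UseHsts(options => options.MaxAge(days: 365).IncludeSubdomains()); // Strict-Transport-Security
app.UseXfo(options => options.Deny());                                 // X-Frame-Options: DENY
app.UseCsp(options => options
    .DefaultSources(s => s.Self())
    .UpgradeInsecureRequests());                                       // Content-Security-Policy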
OpenTelemetry is, roughly speaking, like logs and metrics but with distinguishable TraceIds, so the activity belonging to a single request can be correlated across components.
Yesterday and this morning, I experimented with OpenTelemetry in a sample ASP.Net MVC application.
The primary components are:
A host for Tempo – I am using Grafana-hosted Tempo (https://www.grafana.com). Grafana has a very generous free tier of 100 GB of traces per month.
Grafana Agent – As of now, I have used Grafana Agent on a Windows laptop; I have not configured it on Linux production servers yet. Grafana Agent can be downloaded from here. Click on the releases on the right side and choose the operating system. Here is the link for v0.31.0.
Add the following pre-release DLLs (NuGet packages) to your ASP.Net MVC application.
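The package list from the original post isn't reproduced here. As a minimal sketch, the wiring typically looks like the following; the package names in the comment and the OTLP endpoint are assumptions (port 4317 is the Grafana Agent's default OTLP gRPC port):

using System;
using OpenTelemetry;
using OpenTelemetry.Resources;
using OpenTelemetry.Trace;

// Assumed packages: OpenTelemetry, OpenTelemetry.Instrumentation.AspNet (pre-release at the time),
// OpenTelemetry.Exporter.OpenTelemetryProtocol.
var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .SetResourceBuilder(ResourceBuilder.CreateDefault().AddService("SampleMvcApp"))
    .AddAspNetInstrumentation() // classic ASP.Net MVC request instrumentation
    .AddOtlpExporter(o => o.Endpoint = new Uri("http://localhost:4317"))
    .Build();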
I have discussed the possibility of capturing more information in logs only when needed, such as when errors or exceptions occur, in the following blog post:
I am planning to use Gelf logging for compatibility reasons. Gelf logs can be ingested into pretty much every major centralized logging platform, such as Kibana, GrayLog, Seq, and Grafana. Some require intermediary software to accept Gelf-formatted logs, and some can ingest Gelf-formatted logs directly. However, for various reasons, the logging server might sometimes be unavailable, specifically when the log ingestors are not in a cluster. Log files, on the other hand, can easily be ingested into the above-mentioned centralized logging platforms using various software.
Based on the above use case, I wanted to use Gelf to log directly to the centralized logging server and, as a failover, write the logs to a file that would be ingested at a later point by some other software.
Now, by combining the previous post's example, we can achieve AspNetBuffering and ingest different levels of logs only when errors occur. The configuration should be very easy to understand.
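The exact configuration from the original post isn't reproduced here; below is a minimal sketch of the shape it describes. The Gelf target type comes from a package such as NLog.Targets.Gelf, and the target names and endpoints are assumptions:

<nlog>
  <extensions>
    <add assembly="NLog.Web" />          <!-- provides AspNetBufferingWrapper -->
    <add assembly="NLog.Targets.Gelf" /> <!-- assumed Gelf target package -->
  </extensions>
  <targets>
    <target xsi:type="AspNetBufferingWrapper" name="aspnet-buffer">
      <target xsi:type="PostFilteringWrapper" defaultFilter="level >= LogLevel.Info">
        <when exists="level >= LogLevel.Warn" filter="level >= LogLevel.Debug" />
        <target xsi:type="FallbackGroup" returnToFirstOnSuccess="true">
          <target xsi:type="Gelf" name="gelf" endpoint="udp://graylog:12201" />
          <target xsi:type="File" name="file" fileName="logs/fallback.log" />
        </target>
      </target>
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Debug" writeTo="aspnet-buffer" />
  </rules>
</nlog>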
In the above configuration, we have wrapped the Gelf logger and the File logger inside a FallbackGroup logger. The FallbackGroup logger is wrapped inside a PostFilteringWrapper, which in turn is wrapped inside an AspNetBufferingWrapper.
In the <rules> section, we send all logs of level Debug and above to the AspNetBufferingWrapper.
The AspNetBufferingWrapper buffers the log messages for an entire request/response cycle and then sends them to the PostFilteringWrapper.
The PostFilteringWrapper checks whether any message has a log level of Warn or above: if so, it forwards all messages of level Debug and above; otherwise it forwards only messages of level Info and above. The target of the PostFilteringWrapper is the FallbackGroup logger, which receives these messages.
The FallbackGroup logger attempts to use the Gelf logger; if the Gelf logger is unable to process the messages, the logs are sent to the File logger.
I have discussed the possibility of capturing more information in logs only when needed, such as when errors or exceptions occur, in the following blog post:
The above configuration logs Info and above by default, but if there is a Warn or higher message, it logs Debug and higher. For this to work properly, this logger obviously has to receive Debug messages; otherwise there is no point in using it.
Now, combining these two loggers, here is an example:
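The original example isn't reproduced here; assuming the two loggers being combined are the AspNetBufferingWrapper and the PostFilteringWrapper, a minimal sketch would be:

<targets>
  <target xsi:type="AspNetBufferingWrapper" name="aspnet-buffer">
    <target xsi:type="PostFilteringWrapper" defaultFilter="level >= LogLevel.Info">
      <when exists="level >= LogLevel.Warn" filter="level >= LogLevel.Debug" />
      <target xsi:type="File" name="file" fileName="logs/app.log" />
    </target>
  </target>
</targets>
<rules>
  <logger name="*" minlevel="Debug" writeTo="aspnet-buffer" />
</rules>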
I have created a bucket in S3 with the following retention policies:
AWS S3 Object Lock Policy
I personally don't have to follow compliance requirements yet, but there is nothing wrong with implementing compliance policies.
I have also defined a life-cycle policy to transition objects into Standard-IA (Infrequent Access) after 30 days.
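If you would rather define the same transition in code, here is a sketch using the AWS SDK for .NET; the bucket name placeholder and the rule id are assumptions:

using System.Collections.Generic;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

var s3 = new AmazonS3Client(RegionEndpoint.EUWest2);
await s3.PutLifecycleConfigurationAsync(new PutLifecycleConfigurationRequest
{
    BucketName = "<NAME_OF_S3_BUCKET>",
    Configuration = new LifecycleConfiguration
    {
        Rules = new List<LifecycleRule>
        {
            new LifecycleRule
            {
                Id = "transition-to-standard-ia", // hypothetical rule id
                Status = LifecycleRuleStatus.Enabled,
                Transitions = new List<LifecycleTransition>
                {
                    // Move objects to Standard-IA 30 days after creation.
                    new LifecycleTransition { Days = 30, StorageClass = S3StorageClass.StandardInfrequentAccess }
                }
            }
        }
    }
});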
Now I am developing a Lambda that would create Export tasks in CloudWatch once a week.
Here are some relevant C# code snippets:
var _client = new AmazonCloudWatchLogsClient(RegionEndpoint.EUWest2);
// Initialize the AmazonCloudWatchLogsClient
var response = await _client.DescribeLogGroupsAsync();
// Get the list of LogGroups
foreach (var logGroup in response.LogGroups)
{
    var prefix = $"{from}-{to}-{logGroup.LogGroupName}";
    // You can define your own prefix
    var exportResult = await _client.CreateExportTaskAsync(new CreateExportTaskRequest
    {
        Destination = "<NAME_OF_S3_BUCKET>",
        DestinationPrefix = prefix,
        From = GetUnixMilliSeconds(from),
        LogGroupName = logGroup.LogGroupName,
        TaskName = prefix,
        To = GetUnixMilliSeconds(to),
    });
}
The above code is pretty much self-explanatory. Here is a code snippet for getting Unix milliseconds since the epoch:
long GetUnixMilliSeconds(DateTime dateTime)
{
    // There are 10,000 ticks per millisecond; assumes dateTime is in UTC.
    var _epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
    return (dateTime.Ticks - _epoch.Ticks) / 10000;
}
I had a need to generate random passwords and/or keys and update various config files: for example, keys and passwords used by log-ingesting utilities such as FileBeat, PromTail, and MetricBeat.
In earlier blog posts, I mentioned that at this point the log ingestion, retention, and major alerts implementation is complete. So, obviously, the next part is securing the keys.
Because anything displayed in plain text on screen can be captured by prying eyes, I want to avoid displaying secrets on screen as much as possible.
Here is a small C# code snippet for reading from console without echoing back:
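The original snippet isn't reproduced here; below is a minimal sketch built on Console.ReadKey(intercept: true). The method name GetSensitiveText matches the call in the sample further down:

static string GetSensitiveText()
{
    var builder = new System.Text.StringBuilder();
    while (true)
    {
        // intercept: true reads the key without echoing it to the console
        var key = Console.ReadKey(intercept: true);
        if (key.Key == ConsoleKey.Enter) break;
        if (key.Key == ConsoleKey.Backspace)
        {
            if (builder.Length > 0) builder.Length--; // drop the last captured character
        }
        else
        {
            builder.Append(key.KeyChar);
        }
    }
    Console.WriteLine();
    return builder.ToString();
}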
Now, everyone knows how to open a file, read its content, and replace text. A simple program can take the path of a config file, an old value, and a new value, and perform the replacement.
For example, if during test or alpha modes a key is "KEY", and later a random password generator generates a password and copies it into memory, this type of small tool can help with replacing "KEY" with the "RAND0M P@$$W0rd".
Here is some sample code:
Console.WriteLine("Enter filepath:");
var fileName = Console.ReadLine();
var sr = new StreamReader(fileName);
var content = sr.ReadToEnd();
sr.Close();
Console.WriteLine("Enter Search Phrase:");
var searchPhrase = Console.ReadLine();
var matchedIndex = content.IndexOf(searchPhrase);
if(matchedIndex >= 0)
{
Console.WriteLine("Match found.");
Console.WriteLine("Enter replacement text:");
var replacementText = GetSensitiveText();
var sw = new StreamWriter(fileName);
sw.Write(content.Replace(searchPhrase, replacementText));
sw.Flush();
sw.Close();
}
We prompt for the path to the config file and then for the search text. If the search text is found, we prompt for the secret, i.e. the replacement text, but we don't echo the new sensitive info to the console. Then the search text is replaced with the new sensitive info, and the contents are written back to the file.
In the past, I have written about a little-known utility for collecting metrics called CollectD. CollectD samples various metrics at a configured interval and outputs them to various destinations. This particular blog post is about having CollectD send the metrics to a gRPC endpoint; the endpoint can then decide how to further process the received data. In this blog post, I write about a C# gRPC server for receiving the data, but in reality most programming languages that support gRPC can be used.
One more thing: having CollectD use gRPC is slightly complex, because several different libraries need to be installed. Here is a list for Ubuntu; it is not exhaustive, but these are the libraries I had to install on Ubuntu to allow CollectD to report metrics using gRPC: gcc, gpp, build-essential, protobuf-compiler-grpc, libprotobuf-dev, protobuf-compiler, libgrpc++-dev. The best way to find any missing libraries is to compile CollectD from source as described at https://collectd.org/download.shtml, and, after running ./configure, look for missing libraries next to grpc until the output shows grpc – yes.
Now, for the C# server, here is the .proto file I have used:
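The original file isn't reproduced here. CollectD ships the canonical collectd.proto and types.proto files in its source tree, and those should be used verbatim; the sketch below only shows the relevant service shape (the ValueList message comes from types.proto):

syntax = "proto3";

package collectd;

import "types.proto"; // ships with the CollectD source tree

service Collectd {
  // CollectD streams metric values; the server replies once when the stream ends.
  rpc PutValues (stream PutValuesRequest) returns (PutValuesResponse);
}

message PutValuesRequest {
  collectd.types.ValueList value_list = 1;
}

message PutValuesResponse {
}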
The implementation of the C# server is very simple. I have set the protobuf compiler to generate only the server-side code. Create a class that inherits from CollectdBase and override the PutValues method. Remember that the request is a stream:
public override async Task<PutValuesResponse> PutValues(IAsyncStreamReader<PutValuesRequest> requestStream, ServerCallContext context)
{
    // The client streams many requests; read them one at a time.
    while (await requestStream.MoveNext())
    {
        var currentItem = requestStream.Current;
        // Do something with currentItem
    }
    return new PutValuesResponse();
}
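To host the service, here is a minimal sketch using the classic Grpc.Core server; the service class name CollectdService and the port are assumptions (ASP.NET Core gRPC hosting would work just as well):

using System;
using Grpc.Core;

var server = new Server
{
    // Collectd.BindService is generated by the protobuf compiler from the .proto file.
    Services = { Collectd.BindService(new CollectdService()) },
    Ports = { new ServerPort("0.0.0.0", 50051, ServerCredentials.Insecure) }
};
server.Start();
Console.WriteLine("Listening for CollectD metrics on port 50051. Press Enter to stop.");
Console.ReadLine();
server.ShutdownAsync().Wait();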
*Instead of hardcoding an IP address or "localhost", I would say use a name such as "elasticsearch" or "kibana" and then use the HOSTS file to map it to the actual server. Then, even if you have several applications on the same server and the ElasticSearch server gets changed, you don't have to edit all the config files; you can edit just the hosts file. The hosts file is located at /etc/hosts on Linux and C:\Windows\System32\drivers\etc\hosts on Windows.
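For example, a couple of hypothetical hosts-file entries (the addresses are made up):

10.0.0.21  elasticsearch
10.0.0.22  kibana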
Now we will discuss 4 different interesting wrappers:
Buffering Wrapper
Async Wrapper
AspNetBuffering Wrapper
FallbackGroup Wrapper
These 4 loggers are wrappers, i.e. they don't write logs directly. Instead, they wrap other loggers, providing some interesting functionality that can be taken advantage of depending on necessity and use case.
1. Buffering Wrapper
Buffers log events and sends them in batches.
As mentioned above in the ElasticSearch example, the wrapper buffers messages and sends them in batches.
There is a very interesting use case in combining the AutoFlushWrapper with the BufferingWrapper and the actual target that writes the logs, such as writing the logs only when errors happen, as sketched below.
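A minimal sketch of that combination; the buffer size, file name, and flush condition are assumptions:

<target xsi:type="AutoFlushWrapper" name="autoflush" condition="level >= LogLevel.Error">
  <target xsi:type="BufferingWrapper" bufferSize="100" overflowAction="Discard">
    <target xsi:type="File" name="file" fileName="logs/errors.log" />
  </target>
</target>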
2. Async Wrapper
When you don't need buffering, but at the same time you don't want your application to wait until logging is done, this could be useful.
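A minimal sketch; the queue settings are illustrative. NLog also offers the shorthand async="true" attribute on the <targets> element, which wraps each target in an AsyncWrapper with default settings:

<target xsi:type="AsyncWrapper" name="async-file" queueLimit="10000" overflowAction="Grow">
  <target xsi:type="File" name="file" fileName="logs/app.log" />
</target>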
3. AspNetBuffering Wrapper
Buffers the log messages of an entire request/response cycle, as discussed in the Gelf example above.
4. FallbackGroup Wrapper
This wrapper can be used to wrap multiple targets, for example ElasticSearch followed by CloudWatch followed by File: if the logger is unable to write to ElasticSearch, it writes to CloudWatch, and if that too fails, it writes the logs to a file.
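A minimal sketch; the ElasticSearch and CloudWatch target types come from the NLog.Targets.ElasticSearch and NLog.AWS.Logger packages respectively, and the names and parameters are assumptions:

<target xsi:type="FallbackGroup" name="fallback" returnToFirstOnSuccess="true">
  <target xsi:type="ElasticSearch" name="elastic" uri="http://elasticsearch:9200" />
  <target xsi:type="AWSTarget" name="cloudwatch" logGroup="/app/logs" region="eu-west-2" />
  <target xsi:type="File" name="file" fileName="logs/fallback.log" />
</target>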
Centralizing config generation into a reusable library and having a wrapper class around reading the config means the consuming classes do not need to know the details of where or how to get config values. For example, the config can be stored in plain text or encrypted, and it can live in text files or in something like AWS Secrets Manager.
public static void GenerateConfig()
{
    // <ConfigFile> and <ConfigFile2> are placeholders for your JSON config file names.
    var retVal = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile(<ConfigFile>, optional: false, reloadOnChange: true)
        .AddJsonFile(<ConfigFile2>, optional: false, reloadOnChange: true)
        .AddEnvironmentVariables()
        .Build();
    ConfigHelper.Configuration = retVal;
}
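Hypothetical usage from a consuming class; the key name is made up:

// The consumer doesn't care whether the value came from a JSON file or an environment variable.
var connectionString = ConfigHelper.Configuration["Database:ConnectionString"];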
Code for converting a DateTime into Unix epoch seconds and back to a DateTime:
private static DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

public static long GetUNIXDate(DateTime value)
{
    // Whole seconds since 1970-01-01 UTC; assumes value is a UTC DateTime.
    return (Int64)value.Subtract(epoch).TotalSeconds;
}

public static DateTime GetNormalDateFromUnix(long value)
{
    return epoch.AddSeconds(value);
}
Code for determining whether a string consists entirely of ASCII text:
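The snippet for this one isn't reproduced here; a minimal sketch (the method name is an assumption, and it needs using System.Linq):

public static bool IsASCIIOnly(string value)
{
    if (value == null) return true;
    // Every character must fall within the 7-bit ASCII range (0x00-0x7F).
    return value.All(c => c <= '\u007F');
}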
Code for removing non-ASCII characters and retrieving just the ASCII characters from a string:
private static readonly Regex asciiRegex = new Regex(@"[^\u0000-\u007F]+", RegexOptions.Compiled);

public static string GetASCIIOnly(string value)
{
    if (value == null) return String.Empty;
    // The regex matches runs of characters outside the ASCII range; remove them.
    return asciiRegex.Replace(value, String.Empty);
}
Code for getting a smaller chunk of a long string, appending "..." at the end if necessary; useful on web pages showing the first few characters of a longer text:
public static string GetSnippet(string value, int length, bool appendDots)
{
    if (String.IsNullOrWhiteSpace(value)) return String.Empty;
    // The value already fits; nothing to trim.
    if (value.Length <= length) return value;
    if (appendDots)
    {
        // Reserve 3 characters for the dots so the result stays within length.
        return $"{value.Substring(0, length - 3)}...";
    }
    else
    {
        return value.Substring(0, length);
    }
}