In the above code, we declared an object named person with two properties, accessed the individual properties, and printed their values to the console.
We can use destructuring to extract the properties into variables.
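For example, with a hypothetical person object (the property names here are placeholders for illustration):

var person = { name: "John", age: 30 }; // hypothetical object matching the description above
var { name, age } = person; // destructure the properties into variables
console.log(name); // "John"
console.log(age); // 30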
React is a widely used SPA (single-page application) library built by Facebook. Angular and Vue offer similar functionality. The three frameworks have their own advantages and disadvantages, which are out of scope for this post.
For quick prototyping and other reasons, there is sometimes a need to use React in-browser, i.e., include some JavaScript files from a CDN, write some JavaScript, and see it in action.
In this blog post I won't go into great length or detail; this is more of a getting-started post.
I am assuming you would like to use JSX syntax and would like Babel to transpile it in-browser.
Note: Assuming you are developing a quick prototype, I have provided the development version links for React 17.0.2, but there are also minified production versions. The development scripts are useful for easier debugging because of their detailed error messages.
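The script includes look like this (these unpkg URLs are the standard CDN locations for the React 17.0.2 development builds and Babel standalone; verify them before relying on this sketch):

<script src="https://unpkg.com/react@17.0.2/umd/react.development.js" crossorigin></script>
<script src="https://unpkg.com/react-dom@17.0.2/umd/react-dom.development.js" crossorigin></script>
<script src="https://unpkg.com/@babel/standalone/babel.min.js"></script>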
Now create another .js file of your choice and include it in your HTML page. I'll assume the file is named RootComponent.js.
Note that the type is "text/babel" because we will be using JSX for our markup.
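The include would look something like this:

<script type="text/babel" src="RootComponent.js"></script>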
Add a div with an id of "root" in your HTML page; it becomes the container for our React application. I have added "Please wait…" because the browser will display it while the React framework initializes.
<div id="root">Please wait...</div>
Now add the following code snippet to your RootComponent.js
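A minimal version would look like this (the greeting markup is an arbitrary placeholder):

// Get a reference to the root div using plain JavaScript.
const root = document.getElementById("root");

// A minimal functional component.
function App() {
  return <h1>Hello from React!</h1>;
}

// Ask React to render the App component inside the root element.
ReactDOM.render(<App />, root);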
In the above few lines of code, we used plain JavaScript to get the div with an id of "root".
We defined a minimal functional component named App.
We then used the React library, calling ReactDOM.render and asking React to render the App component in place of the root element, i.e., the div with the id of "root".
CSRF – Cross-Site Request Forgery – is a type of cyber attack that applies specifically when cookies are used for authentication.
A malicious website can post content to a different domain while a user of that domain is logged in, or in certain other scenarios. CSRF is considered one of the major vulnerabilities and has been in the OWASP Top 10 – Cross-Site Request Forgery (CSRF).
If you are using token-based authentication and the token is stored in the browser's local storage, CSRF isn't an issue; the attack applies specifically when cookies are used.
Basic Usage:
In .cshtml of web pages inside forms add the following tag:
@Html.AntiForgeryToken()
The above code fragment would render a hidden input element with a long random string.
In the controller class, decorate the action method with the following attribute:
[ValidateAntiForgeryToken]
When the action method is invoked, the validation happens first. If the validation succeeds, the action method gets invoked; if it fails, the action method does not get invoked.
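Putting the two pieces together, a protected POST action would look roughly like this (ProfileModel and the action name are hypothetical):

using Microsoft.AspNetCore.Mvc;

public class AccountController : Controller
{
    [HttpPost]
    [ValidateAntiForgeryToken]
    public IActionResult UpdateProfile(ProfileModel model) // ProfileModel is a hypothetical view model
    {
        // Reached only if the anti-forgery token validated successfully.
        return RedirectToAction("Index");
    }
}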
Recommended Usages:
If we forget to decorate a POST method with [ValidateAntiForgeryToken], we are susceptible to a CSRF attack. Instead, we can apply the validation globally – via a filter or middleware registered in Startup.cs – so no action is missed.
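One built-in way to do this in ASP.NET Core is the AutoValidateAntiforgeryTokenAttribute global filter (a filter rather than custom middleware, but it achieves the same goal):

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllersWithViews(options =>
    {
        // Validates anti-forgery tokens on all unsafe HTTP methods (POST, PUT, DELETE) by default.
        options.Filters.Add(new AutoValidateAntiforgeryTokenAttribute());
    });
}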
There are other ways of customizing the validation. For example, if JSON data is being sent to a web API and cookies are used for authentication, we can configure a custom header: the calling code reads the hidden element's value, adds it as a request header, and then makes the call.
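As a sketch, in ConfigureServices the built-in antiforgery service can be told to accept the token from a header (the header name below is an arbitrary choice):

services.AddAntiforgery(options =>
{
    // Accept the anti-forgery token from this request header in addition to the form field.
    options.HeaderName = "X-CSRF-TOKEN";
});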
Don't use easy-to-guess passwords. Use a password manager. I would say even you should not know your password!
Do NOT display passwords in plain text anywhere!
Always use MFA – Multi-Factor Authentication! I will write a separate article about various MFA techniques and their strengths and weaknesses, and separate articles for web developers.
Always look for HTTPS when submitting sensitive information. Nowadays this is a lesser threat because most websites use HTTPS, but in some very rare scenarios a website has no SSL yet prompts for a password. I came across one such website in the past 6 years; I don't want to discuss the details, but I contacted the owner and suggested some free alternatives such as Let's Encrypt.
Do not connect to public wifi; use a VPN.
Even on private wifi, use a VPN as much as possible.
Some websites such as Facebook, Outlook, and Gmail allow you to review current sessions and activity. Review these periodically.
NWebSec is a library that I am familiar with and have used in some web applications over the past 3 – 4 years.
Modern web browsers support several HTTP headers for security-related purposes – for example, not caching content, always requiring HTTPS, etc. Most, if not all, of these headers can be set at the web server level instead of at the application level, and there are various guides and blog posts for doing so on HTTP servers.
This blog post is about using NWebSec for setting these headers in ASP.Net web applications. I won't go over the entire list, but I will mention a few.
Configure the X-Frame-Options either at the application level or at the web server level, unless you have a need for your websites to be displayed in iframes of other websites.
Always use HTTPS, by using the Content-Security-Policy, Strict-Transport-Security, and Upgrade-Insecure-Requests headers.
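As a sketch, here is how such headers can be set with NWebSec's ASP.NET Core middleware (method names follow NWebSec's fluent API as I recall it; check the NWebSec documentation for your version):

public void Configure(IApplicationBuilder app)
{
    // X-Frame-Options: DENY – don't allow the site to be displayed in iframes.
    app.UseXfo(options => options.Deny());

    // Strict-Transport-Security: require HTTPS for a year, including subdomains.
    app.UseHsts(options => options.MaxAge(days: 365).IncludeSubdomains());

    // Content-Security-Policy: restrict to same-origin sources and upgrade insecure requests.
    app.UseCsp(options => options
        .DefaultSources(s => s.Self())
        .UpgradeInsecureRequests());
}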
Sometimes there is a need to format the output of a SQL statement into a format such as XML or JSON.
SQL Server has the "FOR XML" clause, which I was familiar with from past work. I recently had a similar use-case with MySQL, and MySQL has functions for outputting JSON.
Assume we have a table named sample with columns id and name, containing a few rows of sample data.
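For illustration, something like this (the rows are hypothetical):

CREATE TABLE sample (
    id   INT PRIMARY KEY,
    name VARCHAR(50)
);

INSERT INTO sample (id, name) VALUES
    (1, 'Alpha'),
    (2, 'Beta'),
    (3, 'Gamma');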
We can use JSON_OBJECT to create JSON objects like this:
select json_object('id', id, 'name', name) from sample;
In the above line of code, we ask MySQL to create, for each row, a JSON document with the attribute names 'id' and 'name', using the values of the id and name columns. With the hypothetical rows above, the output looks like this:
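{"id": 1, "name": "Alpha"}
{"id": 2, "name": "Beta"}
{"id": 3, "name": "Gamma"}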
We can use JSON_ARRAYAGG for aggregating values into a single result, like this; it even works with grouping.
select json_arrayagg(id) from sample;
With the same sample data, the following output would be obtained:
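[1, 2, 3]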
Combining these two functions generates a single JSON document with an array containing one object per row of the output:
select json_arrayagg(json_object('id', id, 'name', name)) from sample;
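With the hypothetical rows above, this produces:

[{"id": 1, "name": "Alpha"}, {"id": 2, "name": "Beta"}, {"id": 3, "name": "Gamma"}]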
How many of you think having advanced search capabilities for websites would be nice? Advanced search as in, for example, searching for content inside a Word file or PDF file? Or maybe search results showing up as soon as you start typing? Or maybe showing search suggestions like Google, but in the search box of your own websites?
Why would anyone need such a search?
Your customers or website visitors would have the ability to search for and find the information they need quickly and accurately.
Some researchers have shown most users have an attention span of 7 – 8 seconds before moving on to the next website. You would have spent a lot of effort on Search Engine Optimization to get people to know your website. Now, if people can find what they are looking for quickly and accurately, wouldn't that help? Maybe the prospective visitor ends up being a sales lead and a customer.
This is the concept for an upcoming product. The product was internally code-named WebSearch, but I wanted a unique name and renamed it WebVeta. "Veta" in my mother tongue, Telugu, means hunt – in other words, hunt for your files / content.
If this concept seems appealing and you think you might have a need, please do contact me.
A few example scenarios:
Scenario 1: Let's say you have a multinational presence and all of your company addresses are mentioned somewhere on the website, and someone from Australia wants to find your U.S.A office address or phone number. How about they start typing "U.S.A pho" and the U.S.A phone number shows up?
Scenario 2: Let's say you have a global corporate website, a U.K-based website, and a U.S.A-based website, with links between the three websites, but search inside each website shows results for only that website. What if the three websites could show consistent search results, including advanced search capabilities, across all three? I.e., irrespective of which website your customer is on, they can find information from across your global corporate websites.
Most of you know I like sharing my knowledge. Here are some simple but very useful DOM manipulation functions in JavaScript.
As part of development for WebSearch, I wanted leaner JavaScript, and for some parts of the development I am using plain JavaScript rather than libraries such as jQuery. jQuery has this functionality and allows easier development, but my situation and necessity are a bit different.
var ele = document.getElementById("elementid");
// for getting a reference to an existing element in the DOM
var dv = document.createElement("div");
// for creating a in-memory element.
parentEle.appendChild(childEle);
// for adding an element as a child element of another element
ele.id = "elementId";
// Setting id of element
ele.classList.add("cssClass");
ele.classList.remove("cssClass");
// Adding and removing css classes
ele.innerText = "Text";
ele.innerHTML = "<span>...</span>";
// Setting text and innerHTML
// caution with innerHTML - don't inject unsafe/unvalidated markup
ele.addEventListener("event", (ev) => {
// Anonymous function
});
// Handle events such as click etc...
ele.addEventListener("event", fnEventHandler);
// Handle events by using a function - fnEventHandler
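Putting a few of these together, a quick sketch:

// Create a notification div, give it an id and a css class,
// remove the class when it is clicked, and add it to the page.
var note = document.createElement("div");
note.id = "note";
note.classList.add("highlight");
note.innerText = "Hello!";
note.addEventListener("click", function (ev) {
    note.classList.remove("highlight");
});
document.body.appendChild(note);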
There are various solutions for collecting, storing and viewing metrics. This blog post is specifically about the following list of software:
CollectD – For collecting system metrics
Carbon-relay-ng – A relay server that receives metrics and forwards them into Graphite
Hosted Graphite at Grafana.com – The backend that stores the metrics
Grafana – For viewing metrics
Grafana – For alerts
Collectd
Collectd is a very lightweight, low-memory, low-CPU-usage Linux tool that runs as a service and can collect various system-related metrics. Collectd is very extensible and has several plugins. Some of the plugins I like and have used are:
Apache web server – Gathers Apache related stats
ConnTrack – Number of connections in Linux connection tracking table
ContextSwitch – Number of context switches
CPU
DNS
IP-Tables
Load
MySQL
Processes
tcpconns
users
vmem
My favorite output plugins, and the ones I am familiar with, are:
CSV
Write Graphite
gRPC
Carbon-relay-ng
This one is not necessarily my favorite, because it is a little heavy on system resources 🙁
Now host Carbon-relay-ng on one of the servers and install Collectd on the servers whose metrics need to be collected. Use Collectd's write_graphite plugin to send metrics to Carbon-relay-ng, and configure Carbon-relay-ng to forward the metrics into hosted Graphite on Grafana.com. A sketch of the Collectd side is shown below.
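The relevant part of collectd.conf would look something like this (the hostname is a placeholder for your Carbon-relay-ng server):

LoadPlugin write_graphite
<Plugin write_graphite>
  <Node "relay">
    # Hypothetical Carbon-relay-ng host; 2003 is the default plaintext carbon port.
    Host "carbon-relay.example.com"
    Port "2003"
    Protocol "tcp"
    Prefix "collectd."
  </Node>
</Plugin>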
For ingesting any code-based metrics, use ahd.Graphite.
var client = new CarbonClient("example.com");

var datapoints = new[]
{
    new Datapoint("data.server1.cpuUsage", 10, DateTime.Now),
    new Datapoint("data.server2.cpuUsage", 15, DateTime.Now),
    new Datapoint("data.server3.cpuUsage", 20, DateTime.Now),
};

await client.SendAsync(datapoints);

// Sample code from https://github.com/ahdde/graphite.net
Instead of instantiating too many CarbonClient instances, I would use either a singleton or a very small pool of instances.
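A minimal sketch of the singleton approach (the hostname is a placeholder):

public static class GraphiteMetrics
{
    // One shared CarbonClient for the whole application.
    public static readonly CarbonClient Client = new CarbonClient("carbon-relay.example.com");
}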
I promised in the announcements blog post to semi-open-source some code from my upcoming project – alerts. Anyone with some programming knowledge can implement such a solution by following this blog. This will be implemented slowly, because I am planning to get a normal 9 – 5 job.
Moreover, for at least 6 – 12 months, the project would be offered completely free of charge to some companies / individuals who see a need and can provide feedback.
OpenTelemetry is pretty much like logs and metrics, but with distinguishable TraceIds that let you follow a request across components.
Yesterday and this morning I have experimented with OpenTelemetry in a sample ASP.Net MVC application.
The primary components are:
A host for Tempo – using Grafana-hosted Tempo at https://www.grafana.com. Grafana has a very generous free tier of 100GB of traces per month.
Grafana Agent – As of now I have used Grafana Agent on a Windows laptop; I have not configured it on Linux production servers yet. Grafana Agent can be downloaded from the grafana/agent GitHub releases page (click Releases on the right side and choose the operating system); v0.31.0 was current at the time of writing.
Add the necessary pre-release OpenTelemetry DLLs to your ASP.Net MVC application.
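As a sketch, these are the usual OpenTelemetry NuGet packages for an ASP.Net application (pre-release at the time of writing), followed by the startup wiring; the method names follow the 1.x-era hosting extensions, so check your package version:

dotnet add package OpenTelemetry.Extensions.Hosting --prerelease
dotnet add package OpenTelemetry.Instrumentation.AspNetCore --prerelease
dotnet add package OpenTelemetry.Exporter.OpenTelemetryProtocol --prerelease

// In ConfigureServices: trace incoming ASP.Net requests and export OTLP to
// localhost:4317 (the default), where Grafana Agent can be configured to
// listen and forward the traces to Tempo. The service name is a placeholder.
services.AddOpenTelemetryTracing(tracing => tracing
    .SetResourceBuilder(ResourceBuilder.CreateDefault().AddService("SampleMvcApp"))
    .AddAspNetCoreInstrumentation()
    .AddOtlpExporter());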