Updated on: 28.Feb.2021
Nginx is a web server for hosting web pages, and it can also work as a reverse proxy.
sudo apt install nginx
sudo systemctl status nginx
sudo ufw allow 'Nginx Full'
- To edit the configuration,
sudo nano /etc/nginx/sites-available/default
To host a Node.js application, you will have to use a reverse proxy. For example, if your Node.js web service is running on port 8080 (on the same server), you will have to add the following to the "default" file,
location /app1 {
proxy_pass http://localhost:8080/;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_cache_bypass $http_upgrade;
}
Note: "/app1" does not end with "/". As a result, if the visitor has typed "app1/", they will see 404 error.
To enable PHP support,
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/var/run/php/php7.3-fpm.sock;
}
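Note: the block above assumes PHP-FPM is installed and that the socket path matches your installed PHP version. The commands below are examples you may need to adjust, for installing PHP-FPM and for checking the actual socket name to put in "fastcgi_pass":
sudo apt install php7.3-fpm
ls /var/run/php/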
If you have a PHP application that was previously hosted on Apache and you would like to host it on Nginx, you will have to migrate the .htaccess rules into the "default" file.
To do this, you will have to copy the contents of .htaccess and follow the instructions in the following converter:
For example, the following will block the public from accessing the .htaccess file:
location ~ /\.ht {
deny all;
}
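As a rough illustration of the kind of translation involved (a hypothetical rule, not taken from any particular application), an Apache .htaccess rewrite such as "RewriteRule ^page/(.*)$ index.php?p=$1 [L]" becomes something like the following inside the Nginx server block:
rewrite ^/page/(.*)$ /index.php?p=$1 last;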
For hosting static files, you may consider making those files cacheable. The following configuration makes all files in the css, js and images directories cacheable.
location ~* ^/(css|js|images) {
expires 1h;
access_log off;
add_header Cache-Control "public";
}
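To confirm the headers are applied, you can request one of the static files and inspect the response headers for "Expires" and "Cache-Control" (the path below is only an example):
curl -I http://localhost/css/style.css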
- To block crawlers, you may add the following to the Nginx config file (below the "server_name" line). The following setting blocks both Bytespider and SemrushBot.
if ($http_user_agent ~ (Bytespider|SemrushBot)) {
return 403;
}
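You can verify the block by sending a request with a matching User-Agent; it should return 403, while a normal request still returns 200 (replace localhost with your site's URL if needed):
curl -I -A "Bytespider" http://localhost/
curl -I http://localhost/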
Notes: in the fail2ban article, we mentioned that you may block crawlers with a new filter. The problem is that fail2ban bans a crawler only after it has already downloaded some files from your web server. If you don't want the crawler to take anything from your website, you should implement the ban in the web server (which prevents it from downloading any files) as well as in fail2ban (which prevents it from coming back again).
- To test the config before restarting Nginx
sudo nginx -t
sudo systemctl restart nginx
- To view the errors generated by PHP or by file requests:
sudo cat /var/log/nginx/error.log
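To watch the errors live while you reproduce a problem, you may also follow the log:
sudo tail -f /var/log/nginx/error.log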
goaccess - the web traffic statistics report generator
echo "deb http://deb.goaccess.io/ $(lsb_release -cs) main" | sudo tee -a /etc/apt/sources.list.d/goaccess.list
wget -O - https://deb.goaccess.io/gnugpg.key | sudo apt-key --keyring /etc/apt/trusted.gpg.d/goaccess.gpg add -
sudo apt update
sudo apt install goaccess
- To find the config file location
goaccess --dcf
sudo nano /etc/goaccess/goaccess.conf
- Uncomment the following options in the config file. This is a compulsory step!
time-format %H:%M:%S
date-format %d/%b/%Y
- Either one of the following must be uncommented:
# NCSA Combined Log Format
log-format %h %^[%d:%t %^] "%r" %s %b "%R" "%u"
# OR NCSA Combined Log Format with Virtual Host
log-format %v:%^ %h %^[%d:%t %^] "%r" %s %b "%R" "%u"
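If you are not sure which of the two formats your server produces, peek at the first line of the access log; if each line starts with a virtual host name followed by a colon, use the second format:
sudo head -n 1 /var/log/nginx/access.log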
sudo goaccess /var/log/nginx/access.log
- To compile a report from multiple files:
sudo zcat /var/log/nginx/access.log.*.gz | goaccess -a -
sudo goaccess /var/log/nginx/access.log -o /var/www/myweb-com/my-rpt/stats.html
# historical - the aggregate report based on all the compressed Nginx log files
# When piping data into goaccess, you have to use "-" (dash symbol) as the input file.
# So, it is not a mistake to have "-" "space" "-o" in the arguments.
sudo zcat /var/log/nginx/access.log.*.gz | sudo goaccess - -o /var/www/myweb-com/my-rpt/stats0.html
- Then, save the above commands into script files. The cron jobs below assume a daily-report script and an aggregate-report script (a sketch of both is shown after these commands); for example,
/usr/local/bin/gen-webstat.sh
sudo chmod +x /usr/local/bin/gen-webstat.sh
sudo chown root:www-data /usr/local/bin/gen-webstat.sh
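A sketch of what the two scripts may contain (the log paths and report locations are the examples used above; gen-webstat-agg.sh is an assumed name for the aggregate script, and it needs the same chmod/chown treatment):
#!/bin/bash
# /usr/local/bin/gen-webstat.sh - report from the current access log
goaccess /var/log/nginx/access.log -o /var/www/myweb-com/my-rpt/stats.html
#!/bin/bash
# /usr/local/bin/gen-webstat-agg.sh - aggregate report from the rotated, compressed logs
zcat /var/log/nginx/access.log.*.gz | goaccess - -o /var/www/myweb-com/my-rpt/stats0.html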
- Then, schedule the report generation,
sudo crontab -e
# Add the following line into the file
55 23 * * * /usr/local/bin/gen-webstat.sh
15 0 * * * /usr/local/bin/gen-webstat-agg.sh
- To view what you have scheduled,
sudo crontab -l
- By default, yesterday's Nginx log file is not compressed. This causes gen-webstat-agg.sh to exclude yesterday's entries from the aggregate report, which makes the report difficult to interpret. In this case, you have to make sure that logrotate compresses yesterday's log file.
sudo nano /etc/logrotate.d/nginx
# Then, comment out the following line in the config file.
## delaycompress
# Save the config file.
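For reference, the relevant part of /etc/logrotate.d/nginx will then look roughly like this (other directives vary by distribution and are omitted here):
/var/log/nginx/*.log {
daily
missingok
rotate 14
compress
## delaycompress
notifempty
}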