I own multiple domain names, and each one hosts a different side project. For the longest time, everything that required hosting lived on Heroku. But their free tier can be quite limited, and paying separately for each project gets costly quickly. So instead, I decided to consolidate everything onto one server using NGINX (recommended to me by Jane Manchun Wong).
Required Resources
Virtual Private Server (VPS)
You’ll need a virtual private server from a provider such as DigitalOcean or AWS EC2. Personally, I use Vultr (here’s the non-referral link), which costs me about $2.50/month.
Domain Names
You will need to register a few domain names. Assuming you already have them, make sure they are pointing at your VPS provider’s nameservers. There should be a DNS section in your registrar’s dashboard where you can select “custom DNS” or something similar. If you are not sure what your VPS provider’s nameservers are, a quick search for “nameserver” plus the provider’s name should turn them up.
Setting up NGINX
Installation and basic setup
Reference from How To Install Nginx on Ubuntu 16.04
Run the following commands after SSH-ing into the VPS. They will install NGINX, add firewall rules to allow traffic to it, and set NGINX to start automatically on boot.
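On Ubuntu 16.04 (the version the referenced tutorial targets), the commands look something like this; the ufw profile name assumes the stock NGINX package:

```shell
# Install NGINX
sudo apt-get update
sudo apt-get install -y nginx

# Allow HTTP and HTTPS traffic through the firewall
sudo ufw allow 'Nginx Full'

# Start NGINX now and have it start automatically on boot
sudo systemctl start nginx
sudo systemctl enable nginx
```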
Configuration setup
Reference from Host Multiple Domains on One Server/IP with Apache or nginx
The virtual host configuration lives at /etc/nginx/conf.d/virtual.conf by default. I recommend backing up the original file before making any changes. (If it doesn’t exist, you can just create it.) Edit the file to look something like the following:
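A minimal version serving two static sites could look like this (the domain names and root paths below are placeholders for your own):

```nginx
server {
    listen 80;
    server_name websiteone.com www.websiteone.com;
    root /opt/htdocs/websiteOne;
}

server {
    listen 80;
    server_name websitetwo.com www.websitetwo.com;
    root /opt/htdocs/websiteTwo;
}
```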
Here are a few things to look at:
- server block — each one represents a different domain or subdomain in use.
- root — the location where the (HTML) files are served from.
- server_name — the (sub)domain name(s) that should load these specific files.
- proxy_pass — in cases where you are forwarding a specific subdomain to a running server, add this with the server’s address after it. (For local servers, either http://127.0.0.1:port or http://localhost:port should work as intended.)
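For example, a block forwarding a subdomain to a local Node.js server might look like the following (the subdomain name and port 3000 are placeholders):

```nginx
server {
    listen 80;
    server_name api.websiteone.com;

    location / {
        # Forward requests to the local server
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```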
After you are done, reload NGINX so the new configuration is loaded and applied.
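Assuming a systemd-based setup, it’s worth checking the configuration for syntax errors before reloading:

```shell
# Verify the configuration, then reload NGINX
sudo nginx -t
sudo systemctl reload nginx
```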
Cloning and linking
Now remember, since the configuration points at /opt/htdocs/websiteName, your first thought might be to clone your projects straight into those folders. That can work, but it’s not ideal, since most operations in those folders require root access.
Instead, you can clone them into your user folder or anywhere else like you normally would, and then create a soft link to connect the path to your repository folder. Something like this:
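For instance (the repository URL and folder names here are placeholders):

```shell
# Clone into your home directory as usual
git clone https://github.com/yourname/websiteName.git ~/websiteName

# Link the repository to the path NGINX serves from
sudo ln -s ~/websiteName /opt/htdocs/websiteName
```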
Of course, when you are cloning a Node.js static site (ReactJS, Angular, or Vue.js), you will want to install (npm install) and build (npm run-script build) it first, then link the ./build folder instead of the base level of the cloned repository. (Similarly for Jekyll sites, which build into the ./_site folder by default.) As for active servers, just make sure your server is running on the same port as listed in the configuration file.
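Putting that together for a typical React project (the folder names are placeholders):

```shell
cd ~/websiteName
npm install
npm run-script build

# Serve the built output rather than the repository root
sudo ln -s ~/websiteName/build /opt/htdocs/websiteName
```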
Set up HTTPS with certbot
Thanks to Let’s Encrypt, you can now get free and easy HTTPS certificates. With the introduction of certbot, everything just got even easier!
Reference from How To Secure Nginx with Let’s Encrypt on Ubuntu 16.04
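On Ubuntu 16.04, following the referenced tutorial, installing certbot and requesting a certificate looks roughly like this (the domain names are placeholders):

```shell
# Install certbot with its NGINX plugin
sudo add-apt-repository ppa:certbot/certbot
sudo apt-get update
sudo apt-get install -y python-certbot-nginx

# Obtain and install a certificate for a domain and its www subdomain
sudo certbot --nginx -d websiteone.com -d www.websiteone.com
```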
Just run the above for all your domain and subdomain names, and certbot will take care of everything. When the certificates are due for renewal, run the following and certbot will renew them for you.
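A dry run first is a good sanity check:

```shell
# Simulate a renewal to make sure everything is set up correctly
sudo certbot renew --dry-run

# Renew any certificates that are close to expiry
sudo certbot renew
```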
Updating everything
Now that you have everything up and running, you might be thinking: well, there seems to be an awful lot to remember if/when I need to update something. Unfortunately, that’s kinda true, but we can always make it easier by adding a script that does it for us.
Here is how one would look:
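A sketch of such a script, assuming the Node.js layout described above (the project names, paths, and build step are placeholders to adapt to your own setup):

```shell
#!/bin/bash
# update.sh — pull and rebuild every project, then reload NGINX
set -e

for repo in ~/websiteOne ~/websiteTwo; do
    cd "$repo"
    git pull
    npm install
    npm run-script build
done

# Reload NGINX in case any configuration changed
sudo nginx -t
sudo systemctl reload nginx
```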
Thanks for reading! Let me know if you have any questions in the comments below.
Other References
- nginx proxy pass redirects ignore port on serverfault
- Continue SSH background task/jobs when closing SSH on superuser
This article was originally published on freeCodeCamp Medium.