Contents

AWS Instance set up for blog and portfolio

TL;DR

  • Create an EC2 instance (or an instance from any other cloud service provider)
    • With ssh login setup (security rules, private key, non-root user)
    • Domain name
    • Install dependencies (Nginx; for SSL/TLS, I use Certbot)
    • Nginx configuration
  • Blog setup
    • I use HUGO
    • Content (.md files)
    • Make sure your blog tool can generate static files
  • Serve the content via Nginx
    • Use GitHub repo to manage blog project
    • Use GitHub Actions to automatically deploy static files on the instance

Launch EC2

AWS or GCP

  • Create a new Instance
  • On your local machine (the one you will use to SSH into the instance)
    • Generate a .pem private key file
    • Restrict its permissions with chmod 400 "key_name.pem"
    • Connect to the instance via its public DNS: ssh -i "Hub_AWS_MacMini_1.pem" ubuntu@ec2-{address}.us-west-1.compute.amazonaws.com
chmod

chmod stands for “change mode”; it is used to change the access permissions of files and directories.

400 is octal notation for the permission bits: read (4), write (2), and execute (1), added together for each class of users. The first digit is for the user (owner) of the file, the second for the group, and the third for others. So 400 sets the permissions to read-only for the owner and no access for the group or others. This is a common setting for sensitive files like private keys.
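The octal modes are easy to check on a throwaway file (the filenames below are just for illustration):

```shell
# demonstrate octal permission modes on a temporary file
tmpfile=$(mktemp)

chmod 400 "$tmpfile"
stat -c '%a' "$tmpfile"   # prints 400: read-only for owner, no access for anyone else

chmod 644 "$tmpfile"
stat -c '%a' "$tmpfile"   # prints 644: owner can read/write, everyone else read-only

rm -f "$tmpfile"
```

`stat -c '%a'` (GNU coreutils) prints the permission bits in the same octal notation that chmod accepts.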

Other common chmod code:

644: read/write for the owner, read-only for group and others; typical for text documents.

755: read/write/execute for the owner, read/execute for group and others; typical for scripts and programs.

777: read/write/execute for everyone; avoid this for anything sensitive.

When syncing with GitHub, run git config core.fileMode false in the repo to stop Git from treating permission changes as file modifications.
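The setting is per-repository and easy to verify (demo in a throwaway repo):

```shell
# in a repo where file permissions differ between machines,
# tell Git to ignore permission-bit changes
dir=$(mktemp -d) && cd "$dir"
git init -q

git config core.fileMode false
git config core.fileMode        # prints: false
```

With this set, a chmod on a tracked file no longer shows up in git status.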

Install dependencies

Update package lists and upgrade the installed packages to latest versions.

sudo apt update
sudo apt upgrade -y

Install Nginx
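On Ubuntu, both Nginx and Certbot are available from apt (the plugin package name below assumes a recent Ubuntu release):

```shell
sudo apt install nginx -y
sudo systemctl enable --now nginx        # start Nginx and keep it running after reboot

# Certbot plus its Nginx plugin, for SSL/TLS certificates later
sudo apt install certbot python3-certbot-nginx -y
```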

Config Nginx

  • Domain name
  • SSL/TLS
  • root or alias
sudo chown -R ubuntu:www-data /home/ubuntu/public
sudo chmod -R 755 /home/ubuntu/public

sudo ln -s /etc/nginx/sites-available/yourblog /etc/nginx/sites-enabled/
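A minimal server block for sites-available might look like the following. This is a sketch; the domain name and paths are placeholders for your own values:

```nginx
server {
    listen 80;
    server_name blog.example.com;      # placeholder: your domain name

    root /home/ubuntu/public;          # the directory the static files are deployed to
    index index.html;

    location / {
        try_files $uri $uri/ =404;
    }
}
```

After linking it into sites-enabled, run sudo nginx -t to validate the configuration and sudo systemctl reload nginx to apply it. Running Certbot with the Nginx plugin (sudo certbot --nginx -d blog.example.com) will then edit this block to add the SSL/TLS listener and the HTTP-to-HTTPS redirect.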

Test the config

Use scp to transfer some static files and check that the configuration works.

scp -r -i ./Hub_AWS_MacMini_1.pem {some Next project build dir}/out/ ubuntu@ec2-54-241-83-0.us-west-1.compute.amazonaws.com:/home/ubuntu/    

Set up HUGO

  • Install the latest version of HUGO
  • Config the basics and choose a theme
  • Copy your markdown files into content/posts
# create a new website:
hugo new site my_website

# install theme
git submodule add https://github.com/dillonzq/LoveIt.git themes/LoveIt

# create a post
hugo new posts/first_post.md

# launch a local server (the theme uses .Scratch, so it recommends --disableFastRender)
hugo serve --disableFastRender

# build
hugo --minify

CI/CD pipelines

Create a workflow file in the repo (under .github/workflows/)

name: Deploy to AWS

on:
  push:
    branches: [ main ]

jobs:
  hugo-build-scp:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
        with:
          submodules: true  # Fetch Hugo themes (true OR recursive)
          fetch-depth: 0    # Fetch all history for .GitInfo and .Lastmod

      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: '0.123.8'
          # extended: true

      - name: Build
        run: hugo --minify

      - name: Setup SSH
        run: |
          mkdir -p ~/.ssh
          echo "$SSH_PRIVATE_KEY" > ~/.ssh/id_rsa
          chmod 600 ~/.ssh/id_rsa          
        env:
          SSH_PRIVATE_KEY: ${{ secrets.SSH_PRIVATE_KEY }}
        shell: bash

      - name: Update known_hosts
        run: ssh-keyscan -H $AWS_INSTANCE_IP >> ~/.ssh/known_hosts
        env:
          AWS_INSTANCE_IP: ${{ secrets.AWS_INSTANCE_IP }}
        shell: bash

      - name: Copy /public to AWS instance
        run: scp -r public/ $AWS_INSTANCE_USER@$AWS_INSTANCE_IP:~/
        env:
          AWS_INSTANCE_IP: ${{ secrets.AWS_INSTANCE_IP }}
          AWS_INSTANCE_USER: ${{ secrets.AWS_INSTANCE_USER }}
        shell: bash

Now, whenever you push your latest changes to the repo, the served directory on the AWS instance is updated automatically.

Local management for markdown files

Since I use Obsidian as my main note-taking tool, all the .md files live in an Obsidian vault folder. Instead of copying blog-post files back and forth, I create hard links to keep the two directories in sync.

A symlink, or symbolic link, is a shortcut to another file: a file that points to another file. To create a symbolic link, use the following command:

ln -s target_path link_path

A hard link points directly to an object (an inode); a symbolic link does not, it only stores a path to the object. A soft link is more of a shortcut to the original file, while a hard link is another name for the same inode: change the file through one name and the change appears under the other. The inode is only deleted once every hard link (every path to it) has been removed.

ln target_file link
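The mirror behaviour is easy to see in a throwaway directory (filenames below are just for the demo):

```shell
# hard links: two directory entries for the same inode
dir=$(mktemp -d) && cd "$dir"

echo "draft" > post.md
ln post.md mirror.md            # hard link: same inode, second name
echo "edit" >> post.md          # change the file through one name...
cat mirror.md                   # ...and it is visible through the other

stat -c '%h' post.md            # prints 2: the inode now has two links
rm post.md                      # the inode survives until the last link is gone
cat mirror.md                   # still prints both lines
```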

After creating the links in content/posts, I can edit the posts in Obsidian and the changes appear in the HUGO site.

To be continued

  • Portfolio Set up
  • Detail Nginx config