Useful Ubuntu Commands

Shamik edited this page Mar 19, 2026 · 7 revisions

To configure a default Ubuntu editor

sudo update-alternatives --config editor

Environment variables

Checking and adding environment variables.

To print all environment variables

printenv

To print a specific environment variable (HOME)

printenv HOME
echo $HOME

To print a specific environment variable (PATH)

printenv PATH
echo $PATH

To print multiple environment variables (HOME & PATH)

printenv HOME PATH

Adding a specific path to the $PATH environment variable

# replace my username with yours in the command below
export PATH="/home/shamik/.local/bin:$PATH"

To set an environment variable permanently

In order to configure a new environment variable to be persistent, we’ll need to edit the Bash configuration files. This can be done through three different files, depending on exactly how you plan to access the environment variable.

  • ~/.bashrc – This file lives in the user’s home directory, so variables set here are accessible only to that user. They are loaded every time that user opens a new shell.
  • /etc/profile – Variables set here are accessible to all users and are loaded whenever a new login shell is opened.
  • /etc/environment – Variables stored here are accessible system-wide.
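The first route can be sketched end to end; the temporary file below stands in for ~/.bashrc so nothing real is modified:

```shell
# Sketch: append an export line to an rc file and re-source it.
# A temporary file stands in for ~/.bashrc so the real one is untouched.
rcfile=$(mktemp)
echo 'export PATH="$HOME/.local/bin:$PATH"' >> "$rcfile"
. "$rcfile"
case ":$PATH:" in
  *":$HOME/.local/bin:"*) echo "PATH updated" ;;
esac
```

With the real ~/.bashrc the change takes effect in every new shell, or immediately after `source ~/.bashrc`.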

Add it to ~/.bashrc, accessible only by the user

# replace my username with yours in the command below
export PATH="/home/shamik/.local/bin:$PATH"

Add it to /etc/profile, accessible to all users

# replace my username with yours in the command below
export PATH="/home/shamik/.local/bin:$PATH"

Add it to /etc/environment, accessible system-wide

# /etc/environment is not a shell script: references such as $PATH are NOT
# expanded, so list the existing directories explicitly (and replace my username with yours)
PATH="/home/shamik/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"

Delete multiple files without removing the directory

rm -r <path of the folder>/*

Error: Argument list too long

This happens when the folder contains too many files for a single command line, so the following works instead:

find <path of the folder> -type f -exec rm {} +
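The same fix can be exercised safely on a throwaway directory (the paths below are illustrative):

```shell
# Create a throwaway directory full of files, delete only the files,
# and confirm the directory itself survives.
dir=$(mktemp -d)
for i in $(seq 1 100); do touch "$dir/file_$i.txt"; done
find "$dir" -type f -exec rm {} +
ls -A "$dir" | wc -l   # 0 – the files are gone, the directory remains
```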

Finding files that do not have specific file extensions

find <path to dir> -type f ! \( -name "<file extension>" -o -name "<file extension>" \)
# find ./testing_metrics/ -type f ! \( -name "*.csv" -o -name "*.parquet" \) 
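A quick throwaway-directory check of the negated pattern (the file names are made up):

```shell
# Only c.log is printed: it is neither *.csv nor *.parquet
dir=$(mktemp -d)
touch "$dir/a.csv" "$dir/b.parquet" "$dir/c.log"
find "$dir" -type f ! \( -name "*.csv" -o -name "*.parquet" \)
```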

Finding files that do not have specific file extensions and deleting them

find <path to dir> -type f ! \( -name "<file extension>" -o -name "<file extension>" \) -delete
# find ./testing_metrics/ -type f ! \( -name "*.csv" -o -name "*.parquet" \) -delete

Finding a file by name, case-insensitively

find . -type f -iname "*some_name*"

Zip a folder

zip -rj <zipped folder path> <path to the folder>

Adding new files to the zipped folder

zip -rju <zipped folder path> <path to the folder>

Zip a set of files

zip <zipped folder path> <file1> <file2> <file3> <file4>

Unzip file to current directory

unzip <file name>

Unzip file to a specific directory

unzip <file name> -d <path to the directory>

List all files in a zipped folder

unzip -Z <folder name>

Tar a dir

tar -cvzf <path/where/to/save/the/zipped.tar.gz> <path to dir>
# tar -cvzf roberta-large-iter2-v2.tar.gz roberta-large-iter2-V2/

Tar a given folder without its absolute path; don't forget the "." at the end, which tells tar to archive the contents of the directory it changed into

tar -cvzf <path/where/to/save/the/zipped.tar.gz> -C <folder/path/> .
# tar -cvzf Data/FS\ Data/original_models_threshold_analysis.tar.gz -C Data/FS\ Data/threshold_analysis/ .

Tar a given file without its absolute path

tar -cvzf <path/where/to/save/the/zipped.tar.gz> -C <dir containing the file> <file name>
# tar -cvzf Repos/iTOP-semantic-apply-model/notebooks/Concepts_Filtered_df.tar.gz -C ~/Repos/iTOP-semantic-apply-model/notebooks/ Concepts_Filtered_df.csv

Tar a given file without its absolute path, with bzip2 compression instead of gzip

bzip2 usually compresses the file more than gzip, at the cost of speed.

tar -cjvf <path/where/to/save/the/zipped.tar.bz2> -C <dir containing the file> <file name>
# tar -cjvf ~/Repos/iTOP-semantic-apply-model/notebooks/final_df.tar.bz2 -C ~/Repos/iTOP-semantic-apply-model/notebooks/ final_df.csv

Untar a tar.gz

tar -xvzf <tar file> -C <path to extract>
# tar -xvzf roberta-large-iter2-v2.tar.gz -C roberta-large-iter2-V2/

Downloading a file to a specific folder

wget -P </path/to/directory> <http://example.com/path/to/file>

Changing Ownership of a file

sudo chown <owner>:<group> <filepath> 
# sudo chown shamik:shamik abc.txt

List all available disks

sudo lsblk

Generally sda is the boot disk. If only one additional disk is attached it will be sdb; with multiple attached disks the names auto-increment to sdc, sdd, and so on. If you detach and re-attach a disk, its name auto-increments as well, e.g. sdb becomes sdc even when it is the only extra disk.

To mount a particular disk

sudo mkdir /mnt/disks/<name of your choice>
sudo mount -o discard,defaults /dev/sdb /mnt/disks/<name of your choice>

Compress PDFs with Ghostscript

Adobe link

  1. Install Ghostscript with the command sudo apt install ghostscript.
  2. Once installed, you can use this command to compress PDF file sizes:
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/screen -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdf
  • In the command, replace output.pdf and input.pdf with your chosen filenames. -dPDFSETTINGS is where you choose the output quality and file size. Change the suffix =/screen to suit your needs:

  • -dPDFSETTINGS=/screen — Low quality and small size at 72dpi.

  • -dPDFSETTINGS=/ebook — Slightly better quality but also a larger file size at 150dpi.

  • -dPDFSETTINGS=/prepress — High quality and large size at 300 dpi.

  • -dPDFSETTINGS=/default — System chooses the best output, which can create larger PDF files.

  • Once you've chosen your settings, simply run the command. Your new compressed PDF will be saved in the same folder as the original.

Find a piece of text within multiple PDFs without opening them

find ./ -name '*.pdf' -exec sh -c 'pdftotext "{}" - | grep --with-filename --label="{}" --color "pattern"' \;

Replace part of a file name

Here it renames the matching files, replacing "model" in the file name with "man"

find . -name "model_wearing_pant_*.png" -exec bash -c 'mv "$0" "${0/model/man}"' {} \;
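The rename works via Bash's ${parameter/pattern/replacement} expansion, which replaces the first match; for example:

```shell
# "${name/model/man}" swaps the first occurrence of "model" for "man"
name="model_wearing_pant_0001.png"
echo "${name/model/man}"   # man_wearing_pant_0001.png
```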

Delete a set of files containing a particular pattern

Pattern = women_wearing_pant_0001.png, women_wearing_pant_0002.png, women_wearing_pant_0003.png,…

Method 1

rm output/inpainting_clothes/women_wearing_pant_000[1-5]*

Method 2

ls | grep -i "women_wearing_pant_000[1-5]" | xargs rm

Transfer multiple files/dirs from cloud to local

rsync -vhrmPz -e "ssh -i ~/access_keys/ssh-key-2024-07-12.key" ubuntu@129.80.133.167:/mnt/vol1/ComfyUI/output/ ~/data/dynamic_fashion/

For a dry run only

rsync -vhrmPzn -e "ssh -i ~/access_keys/ssh-key-2024-07-12.key" ubuntu@129.80.133.167:/mnt/vol1/ComfyUI/output/ ~/data/dynamic_fashion/

Shebang for bash

#!/bin/bash

To list all the drives

lsblk

To list the UUIDs of all storage devices

blkid

To always mount a drive in a VM instance automatically

There are two ways: reference the drive by its device name, or by its unique ID (UUID). All mounted drives should live under /mnt; if no suitable folder exists there, create one before proceeding.

  1. Create a folder under /mnt
  2. Lookup the drive name with lsblk or the UUID with blkid
  3. vim /etc/fstab and add either of the following lines
    1. /dev/sdb /mnt/drive ext4 defaults 0 0
    2. UUID="6b4bdae1-9970-4388-9156-39bf60ac9855" /mnt/vol1 ext4 defaults,_netdev,nofail 0 2

Explanations

  1. Mounts the drive sdb at /mnt/drive using the defaults option (read-write access, automatic mounting, etc.). The first 0 means the filesystem is not backed up by dump; the second 0 means it is not checked by fsck at boot.

  2. Mounts the drive by UUID at /mnt/vol1 with defaults, _netdev (a network connection is required before mounting) and nofail (booting continues even if the mount fails). The 0 again means no dump backup; the 2 means the filesystem is checked at boot, after the root filesystem.

To avoid having to prefix every command on a folder with sudo

sudo chown -R ubuntu:ubuntu <folder path>

Configuring a default .cache folder in another path

echo 'export XDG_CACHE_HOME=/path/to/new/cache' >> ~/.bashrc
source ~/.bashrc

Saving a wget file in a directory

wget --content-disposition https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh -P downloads/

Check IP address

curl icanhazip.com
curl ifcfg.me
curl https://ipinfo.io/ip
curl http://myip.lunaproxy.io/

TAR multiple files and folders all without their parent folder

tar -cvzf trial_tar.tar.gz \
  -C ~/repos/ctailml/modules/surrounding_stories_kg/ batched_processing/ \
  -C ~/repos/ctailml/modules/surrounding_stories_kg/ entity_resolution_with_text.pkl \
  -C ~/repos/ctailml/modules/surrounding_stories_kg/ merged_entitiy_kb_iou_0_3.pkl \
  -C ~/repos/ctailml/modules/surrounding_stories_kg/ unmerged_entities.pkl \
  -C ~/repos/ctailml/modules/surrounding_stories_kg/src/time_inc/ reco_api.py \
  -C ~/repos/ctailml/modules/surrounding_stories_kg/ data_download/
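Each -C switches tar's working directory before the paths that follow it, which is what strips the parent folders; a minimal sketch with made-up files:

```shell
# Two files from different folders end up at the archive root
base=$(mktemp -d)
mkdir -p "$base/a" "$base/b"
touch "$base/a/x.txt" "$base/b/y.txt"
tar -czf "$base/demo.tar.gz" -C "$base/a" x.txt -C "$base/b" y.txt
tar -tzf "$base/demo.tar.gz"   # x.txt and y.txt, with no leading paths
```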

Terminal command to export all env vars to the current terminal

export $(grep -v '^#' .env | xargs)
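A quick sketch with a throwaway .env file (note this simple trick breaks for values containing spaces):

```shell
# grep drops comment lines; xargs joins the rest for a single export
envfile=$(mktemp)
printf '# a comment\nFOO=bar\nBAZ=qux\n' > "$envfile"
export $(grep -v '^#' "$envfile" | xargs)
echo "$FOO $BAZ"   # bar qux
```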

Tar all git tracked files

git ls-files -z | tar -cvzf postures.tar.gz --null -T -
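A throwaway repo shows the effect: untracked files never make it into the archive (assuming git is available):

```shell
# Only the tracked (staged) file ends up in the tarball
repo=$(mktemp -d)
cd "$repo"
git init -q
touch tracked.py untracked.log
git add tracked.py
git ls-files -z | tar -czf files.tar.gz --null -T -
tar -tzf files.tar.gz   # tracked.py
```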
