
Fast-Terraform (with AWS)

This repo covers Terraform with Hands-on LABs and Samples using AWS (comprehensive, but simple):

  • Resources, Data Sources, Variables, Meta Arguments, Provisioners, Dynamic Blocks, Modules, Workspaces, Templates, Remote State.
  • Provisioning AWS Components (EC2, EBS, EFS, IAM Roles, IAM Policies, Key-Pairs, VPC with Network Components, Lambda, ECR, ECS with Fargate, EKS with Managed Nodes, ASG, ELB, API Gateway, S3, CloudFront, CodeCommit, CodePipeline, CodeBuild, CodeDeploy), use cases and details. More usage scenarios will be added over time.

Why was this repo created?

  • Shows Terraform details concisely, with simple, clean demos and Hands-on LABs
  • Shows Terraform AWS Hands-on Samples and use cases

Keywords: Terraform, Infrastructure as Code, AWS, Cloud Provisioning

Quick Look (How-To): Terraform Hands-on LABs

These LABs focus on Terraform features and help you learn Terraform:

Quick Look (How-To): AWS Terraform Hands-on Samples

These samples focus on how to create and use AWS components (EC2, EBS, EFS, IAM Roles, IAM Policies, Key-Pairs, VPC with Network Components, Lambda, ECR, ECS with Fargate, EKS with Managed Nodes, ASG, ELB, API Gateway, S3, CloudFront, CodeCommit, CodePipeline, CodeBuild, CodeDeploy) with Terraform:

Table of Contents

Motivation

Why should we use / learn Terraform?

  • Terraform is a popular, cloud-agnostic tool to create/provision cloud infrastructure resources/objects (e.g. Virtual Private Clouds, Virtual Machines, Lambda functions, etc.)
    • Manage any infrastructure
    • Similar to the cloud providers' native Infrastructure as Code (IaC) tools: CloudFormation (AWS), Azure Resource Manager (Azure), Cloud Deployment Manager (Google Cloud)
  • It is free, open source (https://github.com/hashicorp/terraform) and has a large community with enterprise support options.
  • Commands, tasks, and configuration turn into Infrastructure as Code.
    • With IaC, tasks are savable, versionable, repeatable, and testable.
    • With IaC, the desired configuration is defined in a declarative way.
  • Agentless: Terraform doesn’t require any software to be installed on the managed infrastructure.
  • It has well-designed documentation.
  • Terraform uses a modular structure.
  • Terraform tracks your infrastructure with a TF state file.

What is Terraform?

How Terraform Works

  • Terraform works with different providers (AWS, Google Cloud, Azure, Docker, K8s, etc.)

  • After creating the Terraform files (.tf), the main Terraform commands are:

    • init: downloads the required provider plugins and initializes the working directory.
    • validate: checks that the .tf files are syntactically valid and consistent.
    • plan: dry-run of the changes; it does not actually run/provision the infrastructure.
    • apply: runs/provisions the infrastructure.
    • destroy: deletes the infrastructure.
  • Main Commands:

terraform init
terraform validate
terraform plan            # shows the execution plan (dry-run); no confirmation is asked
terraform apply           # asks for confirmation (yes/no) before provisioning
terraform destroy         # asks for confirmation (yes/no) before deleting
  • Command Variants:
terraform plan --var-file="terraform-dev.tfvars"            # use a specific variable file
terraform apply -auto-approve                               # do not ask for confirmation
terraform apply --var-file="terraform-prod.tfvars"          # use a specific variable file
terraform destroy --var-file="terraform-prod.tfvars"        # use a specific variable file
  • Terraform Command Structure:

    (see the command structure diagram in the repository)

  • Terraform Workflow:

    (see the workflow diagram in the repository)

  • The TF state file stores the latest status of the infrastructure after running the "apply" command.

  • After running the "destroy" command, the destroyed resources are removed from the TF state file.

  • TF state files can be stored:

    • on the local PC,
    • on a remote backend (e.g. AWS S3, Terraform Cloud).
  • Please have a look at the LABs and SAMPLEs to learn how Terraform works in real scenarios.

Terraform File Components

  • A Terraform file has different components to define the infrastructure for different purposes:
    • Providers,
    • Resources,
    • Variables,
    • Values (locals, outputs),
    • Meta Arguments (count, for_each, depends_on, lifecycle),
    • Dynamic Blocks,
    • Data Sources,
    • Provisioners,
    • Workspaces,
    • Modules,
    • Templates.

Providers
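Providers tell Terraform which platform's API to use (AWS in this repo). A minimal sketch of a provider configuration; the version constraint and region below are illustrative assumptions:

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"        # assumed version constraint, pin to the version you use
    }
  }
}

provider "aws" {
  region = "eu-central-1"       # assumed region, change to your own
}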

Resources

  • Resources are used to define different cloud components and objects (e.g. EC2 instances, VPC, VPC components such as route tables, subnets, and internet gateways, Lambda, API Gateway, S3 buckets, etc.).

  • To understand the details and features of the cloud components, you should know how the cloud works, which components it has, and how to configure them.

  • Syntax:

    • resource <AWS_Object> <User_Defined_Variable_Name_With_Underscore> {}

      • e.g. resource "aws_instance" "instance" {}
      • e.g. resource "aws_vpc" "my_vpc" {}
      • e.g. resource "aws_subnet" "public" {}
      • e.g. resource "aws_security_group" "allow_ssh" {}

      (a minimal example resource block is sketched after this list)

  • An important part is to check the usage of the resources (which arguments are optional or required) on the Terraform Registry page by searching for terms like "instance", "vpc", or "security group".


  • The Registry documentation for each resource has different parts:

    • Argument Reference (inputs; some arguments are optional, others required)
    • Attribute Reference (outputs)
    • Example code snippets showing how the resource is used
    • Others (e.g. timeouts, imports)
  • Go to LAB to learn resources:
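A minimal sketch following the syntax above; the AMI ID, instance type, and CIDR blocks are illustrative assumptions:

resource "aws_vpc" "my_vpc" {
  cidr_block = "10.0.0.0/16"                 # assumed CIDR block
}

resource "aws_subnet" "public" {
  vpc_id     = aws_vpc.my_vpc.id             # reference to another resource's attribute
  cidr_block = "10.0.1.0/24"                 # assumed CIDR block
}

resource "aws_instance" "instance" {
  ami           = "ami-0123456789abcdef0"    # placeholder AMI ID, look up a real one in the Registry/Console
  instance_type = "t2.micro"                 # assumed instance type
  subnet_id     = aws_subnet.public.id
  tags = {
    Name = "example-instance"
  }
}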

Variables (tfvar)

  • Variables help to avoid hard-coding values in the infrastructure code.

  • The Terraform language uses the following types for its values:

    • string: a sequence of Unicode characters representing some text, like "hello".
    • number: a numeric value. The number type can represent both whole numbers like 15 and fractional values like 6.283185.
    • bool: a boolean value, either true or false. bool values can be used in conditional logic.
    • list (or tuple): a sequence of values, like ["one", "two"]. Elements in a list or tuple are identified by consecutive whole numbers, starting with zero.
    • map (or object): a group of values identified by named labels, like {name = "Mabel", age = 52}.
    • Strings, numbers, and bools are sometimes called primitive types. Lists/tuples and maps/objects are sometimes called complex types, structural types, or collection types.
  • Normally, if you define variables without values, Terraform prompts you on the terminal (stdout) to enter the variable values after running the "terraform apply" command.

  • But if a "tfvars" file is defined, the values in the "tfvars" file are assigned automatically to the corresponding variables.

  • Separate tfvars files can be kept for the development ("DEV") and production ("PROD") environments (a minimal sketch is shown after this list).

  • Go to LAB to learn variables and tfvar file, and provisioning EC2 for different environments:
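A minimal sketch of a variable definition and per-environment tfvars values; the variable names and values are illustrative assumptions:

# variables.tf
variable "environment" {
  type        = string
  description = "Deployment environment (e.g. dev, prod)"
}

variable "instance_type" {
  type    = string
  default = "t2.micro"                        # used if no value is supplied
}

# terraform-dev.tfvars
#   environment   = "dev"
#   instance_type = "t2.micro"

# terraform-prod.tfvars
#   environment   = "prod"
#   instance_type = "t3.large"

# Inside a resource block the variable is referenced as:
#   instance_type = var.instance_type

The tfvars files are selected with the --var-file flag shown in the command variants above.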

Values (Locals, Outputs)

  • "Locals" are also the variables that are mostly used as place-holder variables.

    image

  • "Outputs" are used to put the cloud objects' information (e.g. public IP, DNS, detailed info) out as stdout.

    image

  • "Outputs" after running "terraform apply" command on the terminal stdout:

    image

  • Go to LAB to learn more about variables, locals, outputs and provisioning EC2:
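A minimal sketch of locals and outputs; the local values and the resource name aws_instance.instance are illustrative assumptions:

locals {
  project = "fast-terraform-demo"                 # assumed project name
  common_tags = {
    Project     = local.project
    Environment = "dev"
  }
}

output "instance_public_ip" {
  description = "Public IP of the EC2 instance"
  value       = aws_instance.instance.public_ip   # printed to stdout after 'terraform apply'
}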

Meta Arguments
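A minimal sketch of the count, for_each, depends_on, and lifecycle meta arguments; resource names and values are illustrative assumptions:

resource "aws_s3_bucket" "assets" {
  for_each = toset(["logs", "images"])           # one bucket per element: assets["logs"], assets["images"]
  bucket   = "example-assets-${each.key}"        # assumed bucket name prefix
}

resource "aws_instance" "worker" {
  count         = 2                              # creates worker[0] and worker[1]
  ami           = "ami-0123456789abcdef0"        # placeholder AMI ID
  instance_type = "t2.micro"

  depends_on = [aws_s3_bucket.assets]            # create the buckets before the instances

  lifecycle {
    create_before_destroy = true                 # create the replacement before destroying the old instance
  }
}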

Dynamic Blocks
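A minimal sketch of a dynamic block that generates one ingress rule per port in a security group; the ports and names are illustrative assumptions:

variable "ingress_ports" {
  type    = list(number)
  default = [22, 80, 443]                        # assumed ports
}

resource "aws_security_group" "allow_web" {
  name = "allow-web"

  dynamic "ingress" {
    for_each = var.ingress_ports                 # one ingress block is rendered per port
    content {
      from_port   = ingress.value
      to_port     = ingress.value
      protocol    = "tcp"
      cidr_blocks = ["0.0.0.0/0"]
    }
  }
}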

Data Sources

  • "Data Sources" helps to retrieve/fetch/get data/information from previously created/existed cloud objects/resources.

  • In the example below:

    • "filter" keyword is used to select/filter the existed objects (reources, instances, etc.)
    • "depends_on" keyword provides to run the data block after resource created.

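A minimal sketch of such a data source block using "filter" and "depends_on"; the AMI ID, tag value, and resource names are illustrative assumptions:

resource "aws_instance" "instance" {
  ami           = "ami-0123456789abcdef0"        # placeholder AMI ID
  instance_type = "t2.micro"
  tags = {
    Name = "data-source-demo"                    # assumed tag, used by the filter below
  }
}

data "aws_instance" "fetched" {
  filter {
    name   = "tag:Name"                          # select/filter the existing instance by its tag
    values = ["data-source-demo"]
  }
  depends_on = [aws_instance.instance]           # run this data block after the resource is created
}

output "fetched_public_ip" {
  value = data.aws_instance.fetched.public_ip
}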

  • Go to LAB to learn:

Provisioners

  • "Provisioners" provides to run any commands on the remote instance/virtual machine, or on the local machine.

  • "Provisioners" in the resource block runs only once while creating the resource on remote instance. If the resource is created/provisioned before, "provisioner" block in the resource block doesn't run again.

  • With "null_resource":

    • Without creating any resource,
    • Without depending any resource,
    • Any commands can be run.
  • Provisioners in the "null_resource" run multiple times and it doesn't depend on the resource.

  • With provisioner "file", on the remote instance, new file can be created

  • With provisioner "remote-exec", on the remote instance, any command can be run

  • With provisioner "local-exec", on the local PC, any command can be run on any shell (bash, powershell)


  • Go to LAB to learn about different provisioners:
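A minimal sketch of "file", "remote-exec", "local-exec", and "null_resource" provisioners; the AMI ID, key pair, user, and file names are illustrative assumptions:

resource "aws_instance" "instance" {
  ami           = "ami-0123456789abcdef0"        # placeholder AMI ID (e.g. an Ubuntu AMI)
  instance_type = "t2.micro"
  key_name      = "testkey"                      # assumed existing key pair

  connection {                                   # SSH connection used by the file/remote-exec provisioners
    type        = "ssh"
    host        = self.public_ip
    user        = "ubuntu"                       # assumed user for an Ubuntu AMI
    private_key = file("testkey.pem")            # assumed local private key file
  }

  provisioner "file" {                           # creates/copies a file on the remote instance
    source      = "app.conf"                     # assumed local file
    destination = "/home/ubuntu/app.conf"
  }

  provisioner "remote-exec" {                    # runs commands on the remote instance
    inline = ["sudo apt-get update -y"]
  }

  provisioner "local-exec" {                     # runs a command on the local PC
    command = "echo ${self.public_ip} >> public_ip.txt"
  }
}

resource "null_resource" "run_commands" {
  triggers = {
    always_run = timestamp()                     # changing the trigger re-runs the provisioner
  }

  provisioner "local-exec" {
    command = "echo 'null_resource provisioner ran'"
  }
}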

Modules
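A minimal sketch of calling a local module; the module path, its input variable, and its output are illustrative assumptions:

module "vpc" {
  source     = "./modules/vpc"                   # assumed local module directory
  cidr_block = "10.0.0.0/16"                     # assumed input variable defined inside the module
}

resource "aws_instance" "instance" {
  ami           = "ami-0123456789abcdef0"        # placeholder AMI ID
  instance_type = "t2.micro"
  subnet_id     = module.vpc.public_subnet_id    # assumed output exposed by the module
}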

Workspaces

  • With "Workspaces":

    • a parallel, distinct copy of your infrastructure which you can test and verify in the development, test, and staging,
    • like git, you are working on different workspaces (like branch),
    • single code but different workspaces,
    • it creates multiple state files on different workspace directories.
  • Workspace commands:

terraform workspace help                       # help for workspace commands
terraform workspace new [WorkspaceName]        # create a new workspace
terraform workspace select [WorkspaceName]     # change/select another workspace
terraform workspace show                       # show the current workspace
terraform workspace list                       # list all workspaces
terraform workspace delete [WorkspaceName]     # delete an existing workspace
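A minimal sketch showing how the current workspace name can be used inside the configuration; the resource and tag names are illustrative assumptions:

resource "aws_instance" "instance" {
  ami           = "ami-0123456789abcdef0"                                  # placeholder AMI ID
  instance_type = terraform.workspace == "prod" ? "t3.large" : "t2.micro"  # size per workspace
  tags = {
    Name = "web-${terraform.workspace}"                                    # e.g. web-dev, web-prod
  }
}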

Templates
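A minimal sketch using the templatefile() function to render a user-data script; the template file name and its variable are illustrative assumptions:

# user_data.sh.tpl (assumed template file):
#   #!/bin/bash
#   echo "Welcome to ${app_name}" > /var/www/html/index.html

resource "aws_instance" "instance" {
  ami           = "ami-0123456789abcdef0"        # placeholder AMI ID
  instance_type = "t2.micro"
  user_data = templatefile("${path.module}/user_data.sh.tpl", {
    app_name = "fast-terraform-demo"             # assumed template variable
  })
}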

Backend and Remote States
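A minimal sketch of an S3 remote backend with DynamoDB state locking (see the best practices below); the bucket, key, region, and table names are illustrative assumptions:

terraform {
  backend "s3" {
    bucket         = "example-terraform-state-bucket"   # assumed existing S3 bucket
    key            = "dev/terraform.tfstate"            # path of the state file inside the bucket
    region         = "eu-central-1"                     # assumed region
    dynamodb_table = "terraform-state-lock"             # assumed DynamoDB table used for state locking
    encrypt        = true
  }
}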

Terraform Best Practices

  • Don't change/edit the state file manually. Manipulate the state file only through TF commands (e.g. terraform apply, terraform state).
  • Use a remote state file to share state with other users. Keep the state file in the cloud (S3, Terraform Cloud, etc.).
  • State locking is important to prevent concurrent changes from multiple users.
    • S3 supports state locking and consistency via DynamoDB.
  • Backing up the state file is also important. S3 supports versioning, so versioning the state file gives you a backup.
  • If you use multiple environments (dev, test, staging, production), use one state file per environment. Terraform workspaces provide multiple state files for different environments.
  • Use Git repositories (GitHub, GitLab) to host TF code and share it with other users.
  • Treat your infrastructure code like your application code. Create a CI pipeline/process for your TF code (review the code, run automated tests). This raises the quality of your infrastructure code.
  • Execute Terraform only in an automated build/CD pipeline. This ensures the code runs automatically and from one single place.
  • For naming conventions: https://www.terraform-best-practices.com/naming

AWS Terraform Hands-on Samples

SAMPLE-01: EC2s (Windows 2019 Server, Ubuntu 20.04), VPC, Key-Pairs for SSH, RDP connections

SAMPLE-02: Provisioning Lambda Function, API Gateway and Reaching HTML Page in Python Code From Browsers

SAMPLE-03: EBS (Elastic Block Store: HDD, SSD) and EFS (Elastic File System: NFS) Configuration with EC2s (Ubuntu and Windows Instances)

SAMPLE-04: Provisioning ECR (Elastic Container Repository), Pushing Image to ECR, Provisioning ECS (Elastic Container Service), VPC (Virtual Private Cloud), ELB (Elastic Load Balancer), ECS Tasks and Service on Fargate Cluster

SAMPLE-05: Provisioning ECR, Lambda Function and API Gateway to run Flask App Container on Lambda

SAMPLE-06: Provisioning EKS (Elastic Kubernetes Service) with Managed Nodes using Blueprint and Modules

SAMPLE-07: CI/CD on AWS => Provisioning CodeCommit and CodePipeline, Triggering CodeBuild and CodeDeploy, Running on Lambda Container

SAMPLE-08: Provisioning S3 and CloudFront to serve Static Web Site

SAMPLE-09: Running Gitlab Server using Docker on Local Machine and Making Connection to Provisioned Gitlab Runner on EC2 in Home Internet without Using VPN

SAMPLE-10: Implementing MLOps Pipeline using GitHub, AWS CodePipeline, AWS CodeBuild, AWS CodeDeploy, and AWS Sagemaker (Endpoint)

Details

  • To validate the Terraform files:
    • "terraform validate"
  • For dry-run:
    • "terraform plan"
  • For formatting:
    • "terraform fmt"
  • For debugging:
    • Bash: export TF_LOG="DEBUG"
    • PowerShell: $env:TF_LOG="DEBUG"
  • For writing debug logs to a file:
    • Bash: export TF_LOG_PATH="tmp/terraform.log"
    • PowerShell: $env:TF_LOG_PATH="C:\tmp\terraform.log"

Terraform Cheatsheet

Other Useful Resources Related to Terraform

References
