Why IaC Security Is Critical

In 2023, Palo Alto Networks documented that 65% of cloud breaches originate from IaC misconfigurations: public S3 buckets, overly permissive security groups, disabled logging, missing encryption. What makes these errors devastating is that they scale: an insecure Terraform module reused in 50 environments creates 50 vulnerabilities simultaneously.

The solution is Shift Left Security: move security controls as early as possible in the workflow, before misconfigurations reach the cloud. In this article we integrate three complementary tools: Checkov for broad coverage, Trivy for vulnerability scanning, and OPA for custom organizational policies.
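Shift left can start on the developer's laptop, before the code even reaches CI. As a minimal sketch (assuming the pre-commit framework is in use; the `rev` tag shown is an example and should be pinned to a real release), a `.pre-commit-config.yaml` can run Checkov on every commit that touches Terraform files:

```yaml
# .pre-commit-config.yaml - sketch; pin rev to a real Checkov release
repos:
  - repo: https://github.com/bridgecrewio/checkov
    rev: 3.2.0  # example tag
    hooks:
      - id: checkov
        args: ["--quiet", "--framework", "terraform"]
```

Run `pre-commit install` once per clone; misconfigurations are then flagged locally before the first push.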

The Three Tools and Their Roles

  • Checkov: 1000+ built-in policies for AWS/Azure/GCP/K8s. Specialized in IaC. Maintained by Bridgecrew (acquired by Palo Alto).
  • Trivy: Successor to tfsec (Aqua Security). Unified scanner: IaC, containers, dependencies, secrets. tfsec → Trivy migration completed in 2024.
  • OPA + conftest: Generic policy engine. Write policies in Rego for custom organizational rules (naming conventions, mandatory tags, permitted regions).

Checkov: Security Scanning with 1000+ Policies

Checkov parses Terraform (.tf) files statically, without running init or plan. It covers common vulnerabilities: unencrypted storage, unintended public access, disabled logging, overly permissive IAM.

# Installation
pip install checkov

# Or with Docker (recommended for CI)
docker pull bridgecrew/checkov:latest

# Basic scan of a Terraform directory
checkov -d ./environments/production --framework terraform

# Only HIGH and CRITICAL findings (ignore LOW and MEDIUM)
checkov -d . --check HIGH,CRITICAL

# SARIF output for GitHub Code Scanning integration
checkov -d . --output sarif --output-file checkov.sarif

# Skip specific checks (with a documented reason)
checkov -d . --skip-check CKV_AWS_20,CKV_AWS_57
# CKV_AWS_20: S3 bucket public read - skip only if the bucket is intentionally public
# Example: Terraform code with vulnerabilities and the corresponding fix

# VULNERABLE - Checkov alerts: CKV_AWS_18, CKV_AWS_52, CKV_AWS_144
resource "aws_s3_bucket" "logs" {
  bucket = "my-app-logs"
  # MISSING: server_side_encryption_configuration
  # MISSING: versioning
  # MISSING: replication_configuration for disaster recovery
}

# SECURE - All Checkov checks pass
resource "aws_s3_bucket" "logs" {
  bucket = "my-app-logs"
}

resource "aws_s3_bucket_server_side_encryption_configuration" "logs" {
  bucket = aws_s3_bucket.logs.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "aws:kms"  # CKV_AWS_19: use KMS, not AES256
    }
    bucket_key_enabled = true
  }
}

resource "aws_s3_bucket_versioning" "logs" {
  bucket = aws_s3_bucket.logs.id
  versioning_configuration {
    status = "Enabled"  # CKV_AWS_21
  }
}

resource "aws_s3_bucket_public_access_block" "logs" {
  bucket = aws_s3_bucket.logs.id

  block_public_acls       = true  # CKV_AWS_53
  block_public_policy     = true  # CKV_AWS_54
  ignore_public_acls      = true  # CKV_AWS_55
  restrict_public_buckets = true  # CKV_AWS_56
}

resource "aws_s3_bucket_logging" "logs" {
  bucket = aws_s3_bucket.logs.id

  target_bucket = aws_s3_bucket.access_logs.id  # CKV_AWS_18
  target_prefix = "s3-access-logs/"
}
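Besides the CLI `--skip-check` flag, Checkov also supports inline suppressions directly in the Terraform code, which keeps the justification next to the resource it applies to (resource and bucket names below are illustrative):

```hcl
resource "aws_s3_bucket" "public_site" {
  #checkov:skip=CKV_AWS_20:Bucket intentionally public - serves the static marketing site
  bucket = "my-app-public-site"
}
```

The syntax is `#checkov:skip=<CHECK_ID>:<reason>`; the suppression and its reason appear in the scan output, so the exception stays auditable.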
# checkov.yaml - Project configuration (in the repo root)
# Skip non-applicable checks with a documented reason
skip-check:
  - CKV_AWS_144  # S3 cross-region replication: not required for internal log buckets

# Scan only resources in custom modules (exclude registry modules)
directory:
  - modules/

# Severities that should not fail the build
soft-fail-on: LOW,MEDIUM  # Do not fail the pipeline on LOW/MEDIUM

# Full (non-compact) output
compact: false

Trivy: Unified IaC Scanner

Trivy is the successor to tfsec (migration completed in 2024). Its advantage over Checkov is that Trivy also scans container images, application dependencies, and secrets in the same pipeline, reducing the number of tools.

# Installation
brew install trivy          # macOS
# or:
curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/install.sh | sh

# Scan Terraform
trivy config ./environments/production

# With a severity filter
trivy config --severity HIGH,CRITICAL ./

# JSON output for parsing
trivy config --format json --output trivy-results.json ./

# Scan a specific environment including its tfvars for more detail
trivy config --tf-vars ./environments/production/terraform.tfvars ./environments/production

# Scan the whole repo: IaC + dependencies + secrets
trivy fs ./
# Example Trivy output (simplified)
# trivy config ./environments/production

2024-01-15T10:30:00Z INFO Misconfiguration Types: terraform
environments/production/main.tf (terraform)

HIGH: Resource 'aws_security_group.app' has a wide open egress rule
------------------------------------------
Via: environments/production/main.tf:45

    egress {
      from_port   = 0
      to_port     = 0
      protocol    = "-1"
      cidr_blocks = ["0.0.0.0/0"]  # VULN: egress completely open
    }

Fix: Restrict egress to only the required ports and destinations.

MEDIUM: Resource 'aws_db_instance.main' has no deletion protection
Via: environments/production/main.tf:78
    deletion_protection = false  # VULN: must be true in production
# .trivyignore - Ignore specific checks with a documented reason
# Format: CHECK_ID # Mandatory comment with the reason

AVD-AWS-0025 # S3: permissive CORS required for the public CDN
AVD-AWS-0168 # CloudFront: TLS 1.0 allowed for legacy pre-2022 clients

OPA and conftest: Custom Policies with Rego

Open Policy Agent (OPA) is a generic policy engine used by Kubernetes, Istio and Terraform. conftest is the CLI that applies OPA policies to configuration files (including Terraform plan JSON). With OPA you can implement organizational policies that do not exist in Checkov or Trivy: naming conventions, mandatory tags, permitted regions, cost limits.

# Install conftest
brew install conftest

# or:
curl -L https://github.com/open-policy-agent/conftest/releases/download/v0.49.0/conftest_0.49.0_linux_amd64.tar.gz | tar xz
# policy/terraform/naming.rego
# Policy: all AWS resources must follow the company naming convention

package main

import future.keywords.in

# Regex for the naming convention: {team}-{environment}-{name}
naming_pattern := `^[a-z]+-(?:dev|staging|production)-[a-z0-9-]+$`

# Resources that must follow the naming convention
resources_requiring_naming := {
  "aws_s3_bucket",
  "aws_rds_instance",
  "aws_elasticache_cluster",
  "aws_eks_cluster",
}

# Deny if the name does not follow the convention
deny[msg] {
  resource := input.resource[resource_type][resource_name]
  resource_type in resources_requiring_naming
  name := resource.config.bucket[0].constant_value  # for S3
  not regex.match(naming_pattern, name)
  msg := sprintf(
    "POLICY VIOLATION: %s '%s' has name '%s' which does not follow the naming convention '%s'",
    [resource_type, resource_name, name, naming_pattern]
  )
}
# policy/terraform/tags.rego
# Policy: mandatory tags on all AWS resources

package main

import future.keywords.in

# Mandatory tags for compliance and cost allocation
required_tags := {
  "Team",
  "Environment",
  "CostCenter",
  "ManagedBy",
}

# Resources that must be tagged
taggable_resources := {
  "aws_instance",
  "aws_s3_bucket",
  "aws_rds_instance",
  "aws_eks_cluster",
  "aws_lambda_function",
}

deny[msg] {
  resource := input.resource[resource_type][resource_name]
  resource_type in taggable_resources

  # Get the tags from the Terraform configuration (plan JSON format)
  tags := resource.config.tags[0].constant_value

  # Check that every mandatory tag is present
  required_tag := required_tags[_]
  not tags[required_tag]

  msg := sprintf(
    "POLICY VIOLATION: %s '%s' is missing the mandatory tag '%s'",
    [resource_type, resource_name, required_tag]
  )
}

# Policy: the 'Environment' tag must have a valid value
deny[msg] {
  resource := input.resource[resource_type][resource_name]
  resource_type in taggable_resources

  tags := resource.config.tags[0].constant_value
  env := tags.Environment
  not env in {"dev", "staging", "production"}

  msg := sprintf(
    "POLICY VIOLATION: %s '%s' has invalid tag Environment='%s'. Must be dev|staging|production",
    [resource_type, resource_name, env]
  )
}
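Rego policies are code, and conftest can unit-test them with `conftest verify` before they gate real plans. A minimal sketch (the test file name and the input shape are assumptions matching the tags policy above):

```rego
# policy/terraform/tags_test.rego - hypothetical test file
package main

# A bucket missing the CostCenter tag should produce at least one violation
test_deny_missing_costcenter {
  cfg := {"resource": {"aws_s3_bucket": {"logs": {"config": {"tags": [{"constant_value": {
    "Team": "platform",
    "Environment": "dev",
    "ManagedBy": "terraform"
  }}]}}}}}
  count(deny) > 0 with input as cfg
}
```

Run the tests with `conftest verify --policy policy/terraform/`; any rule whose name starts with `test_` is treated as a test case.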
# policy/terraform/regions.rego
# Policy: deployment allowed only in EU regions (GDPR compliance)

package main

import future.keywords.in

allowed_regions := {"eu-west-1", "eu-west-2", "eu-central-1", "eu-south-1"}

deny[msg] {
  resource := input.resource[resource_type][resource_name]
  region := resource.config.region[0].constant_value

  not region in allowed_regions

  msg := sprintf(
    "POLICY VIOLATION: %s '%s' uses region '%s', which is not allowed for GDPR compliance. Allowed regions: %v",
    [resource_type, resource_name, region, allowed_regions]
  )
}
# Using conftest with a Terraform plan in JSON

# 1. Generate the Terraform plan in JSON format
terraform init
terraform plan -out=tfplan
terraform show -json tfplan > tfplan.json

# 2. Run conftest with the policies
conftest test tfplan.json --policy policy/terraform/

# Example output:
# FAIL - tfplan.json - main - POLICY VIOLATION: aws_s3_bucket 'logs' is missing the mandatory tag 'CostCenter'
# FAIL - tfplan.json - main - POLICY VIOLATION: aws_s3_bucket 'data' uses region 'us-east-1', which is not allowed for GDPR compliance
# 2 tests, 2 failures

# Test with structured JSON output
conftest test tfplan.json --policy policy/ --output json

# Test with a specific namespace
conftest test tfplan.json --policy policy/ --namespace main

Integrated CI Pipeline with All Three Tools

# .github/workflows/security-scan.yml
name: IaC Security Scan

on:
  pull_request:
    paths:
      - '**.tf'
      - '**.tfvars'

jobs:
  checkov:
    name: Checkov Scan
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run Checkov
        uses: bridgecrewio/checkov-action@v12
        with:
          directory: .
          framework: terraform
          output_format: sarif
          output_file_path: reports/checkov.sarif
          soft_fail: true  # Do not fail the pipeline; findings are still uploaded as SARIF

      - name: Upload SARIF to GitHub Code Scanning
        uses: github/codeql-action/upload-sarif@v3
        if: always()
        with:
          sarif_file: reports/checkov.sarif

  trivy:
    name: Trivy Scan
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run Trivy
        uses: aquasecurity/trivy-action@master
        with:
          scan-type: 'config'
          scan-ref: '.'
          format: 'sarif'
          output: 'trivy.sarif'
          severity: 'HIGH,CRITICAL'
          exit-code: '1'   # Fail the pipeline on HIGH/CRITICAL findings

      - name: Upload Trivy SARIF
        uses: github/codeql-action/upload-sarif@v3
        if: always()
        with:
          sarif_file: trivy.sarif

  opa-policy:
    name: OPA Policy Check
    runs-on: ubuntu-latest
    needs: [checkov, trivy]

    steps:
      - uses: actions/checkout@v4

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v3
        with:
          terraform_version: '1.7.5'

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.TERRAFORM_PLAN_ROLE }}
          aws-region: eu-west-1

      - name: Generate Terraform Plan JSON
        run: |
          cd environments/production
          terraform init -no-color
          terraform plan -out=tfplan -no-color
          terraform show -json tfplan > ../../tfplan.json

      - name: Install conftest
        run: |
          curl -L https://github.com/open-policy-agent/conftest/releases/download/v0.49.0/conftest_0.49.0_linux_amd64.tar.gz | tar xz
          sudo mv conftest /usr/local/bin/

      - name: Run OPA Policies
        run: |
          conftest test tfplan.json --policy policy/terraform/ --output json | tee opa-results.json
          # Fail the job if any policy violations are present
          jq -e '.[] | select(.failures | length > 0)' opa-results.json && exit 1 || exit 0

Coverage Matrix: Which Tools for Which Risks

Risk                              Checkov             Trivy                   OPA
S3 public access                  Yes (CKV_AWS_20)    Yes                     Custom
Missing encryption at rest        Yes (many checks)   Yes                     No
Overly permissive security group  Yes                 Yes                     No
Missing mandatory tags            Partial             No                      Yes (custom Rego)
Naming convention                 No                  No                      Yes (custom Rego)
Regions not allowed (GDPR)        No                  No                      Yes (custom Rego)
Secrets in the code               Partial             Yes (secret scanning)   No
Container vulnerabilities         No                  Yes                     No
Vulnerable dependencies           No                  Yes                     No

Conclusions and Next Steps

The combination of Checkov (broad built-in coverage), Trivy (unified scanner) and OPA (custom organizational policies) forms a layered defense that covers most of the security risks in IaC. The key principle is that these tools must block PR merging, not emit dismissable warnings.

Next Articles in the Series

  • Article 7: Terraform Multi-Cloud — AWS + Azure + GCP with Shared Modules: build an abstraction layer to manage multiple providers with a unified interface.
  • Article 8: GitOps for Terraform — Flux TF Controller and Drift Detection: bring Terraform into the GitOps paradigm with continuous reconciliation.