I build modern web applications and custom digital tools that help businesses grow through technological innovation. My passion is combining computer science and economics to generate real value.
My passion for computer science was born in the classrooms of the Istituto Tecnico Commerciale di Maglie, where I discovered the power of programming and the appeal of building digital solutions. I understood right away that computing was not just code, but an extraordinary tool for turning ideas into reality.
During my secondary studies in Business Information Systems, I began to weave together computer science and economics, learning how technology can drive growth for any business. That outlook stayed with me at the Università degli Studi di Bari, where I earned a degree in Computer Science, deepening my technical skills and my passion for software development.
Today I put this experience at the service of companies, professionals, and startups, building tailor-made digital solutions that automate processes, optimize resources, and open new business opportunities. Because true innovation begins when technology meets people's real needs.
My Skills
Data Analysis & Forecasting Models
I turn data into strategic insights, with in-depth analysis and predictive models that support informed decisions
Process Automation
I build custom tools that automate repetitive operations and free up time for higher-value work
Custom Systems
I develop bespoke software systems, from cross-platform integrations to personalized dashboards
I firmly believe that computing is the most powerful tool for turning ideas into reality and improving people's lives.
🚀
Democratizing Technology
My mission is to make computing accessible to everyone: from small local businesses to innovative startups, to professionals who want to digitize their work. Every organization deserves to harness the potential of digital tools.
💡
Combining Computer Science and Economics
It is not just about writing code: it is about understanding how technology can generate real value. By combining technical skills with an economic perspective, I help businesses grow, streamline processes, and reach new levels of efficiency and profitability.
🎯
Building Tailor-Made Solutions
Every business is unique, and its solutions should be too. I develop custom tools that address each client's specific needs, automating repetitive processes and freeing up time for what really matters: growing the business.
Transform Your Business with Technology
Whether you run a shop, a professional practice, or a company, I can help you harness the power of computing to work better, faster, and smarter.
My academic background and the technologies I work with
Professional Certifications
8 certifications earned
Reinvention With Agentic AI Learning Program
Anthropic
December 2024
Agentic AI Fluency
Anthropic
December 2024
AI Fluency for Students
Anthropic
December 2024
AI Fluency: Framework and Foundations
Anthropic
December 2024
Claude with the Anthropic API
Anthropic
December 2024
Master SQL
RoadMap.sh
November 2024
Oracle Certified Foundations Associate
Oracle
October 2024
People Leadership Credential
Connect
September 2024
💻 Languages & Technologies
☕Java
🐍Python
📜JavaScript
🅰️Angular
⚛️React
🔷TypeScript
🗄️SQL
🐘PHP
🎨CSS/SCSS
🔧Node.js
🐳Docker
🌿Git
💼
12/2024 - Present
Custom Software Engineering Analyst
Accenture
Bari, Apulia, Italy · Hybrid
Analysis and development of IT systems using Java and Quarkus in the Health and Public Sector. Ongoing training on modern technologies for building custom, efficient software solutions and on AI agents.
💼
06/2022 - 12/2024
Software Analyst and Back-End Developer Associate Consultant
Links Management and Technology SpA
Experience analyzing as-is software systems and ETL flows using PowerCenter. Completed training on Spring Boot for building modern, scalable backend applications. Backend developer specializing in Spring Boot, with experience in database design, analysis, development, and testing of assigned tasks.
💼
02/2021 - 10/2021
Software Developer
Adesso.it (formerly WebScience srl)
Experience with AS-IS and TO-BE analysis, SEO improvements, and website enhancements to improve performance and user engagement.
🎓
2018 - 2025
Bachelor's Degree in Computer Science
Università degli Studi di Bari Aldo Moro
Bachelor's degree in Computer Science, focusing on software engineering, algorithms, and modern development practices.
📚
2013 - 2018
Diploma in Business Information Systems
Istituto Tecnico Commerciale di Maglie
Technical diploma specializing in Business Information Systems, combining IT knowledge with business management.
Contact Me
Have a project in mind? Let's talk! Fill out the form below and I will get back to you as soon as possible.
* Required fields. Your data will be used only to respond to your request.
Introduction: Why Green Software Is a Necessity
The ICT sector accounts for approximately 2-4% of global CO₂ emissions,
a share comparable to the aviation industry. Global data centers consumed around
460 TWh in 2022 and, according to the International Energy Agency (IEA), could
exceed 1,000 TWh by 2026. In the United States alone, data centers already consume
over 4% of national electricity, with projections suggesting a possible 12% by 2030.
Against this backdrop, the Green Software Foundation (GSF) — a consortium founded in 2021
by Accenture, GitHub, Microsoft, and ThoughtWorks under the Linux Foundation — has defined
a systematic framework to reduce the environmental impact of software. This is not merely about optimization:
it is a paradigm shift that introduces carbon efficiency as a software quality metric
on par with performance, security, and maintainability.
In this article we will explore in depth the 8 principles of Green Software Engineering,
the SCI (Software Carbon Intensity) specification ratified as ISO/IEC 21031:2024,
and the practical tools to measure and reduce the emissions of our code.
What You Will Learn
The 8 core principles of the Green Software Foundation
The SCI formula and how to calculate software carbon intensity
Practical strategies for Carbon Efficiency, Energy Efficiency, and Carbon Awareness
Hardware Efficiency and the concept of embodied carbon
Green code patterns with concrete examples in Python, TypeScript, and Java
Climate commitments: the difference between neutralization, offsetting, and abatement
The Green Software Foundation: Mission and Structure
The Green Software Foundation (GSF) is a non-profit organization with the mission of
"creating a trusted ecosystem of people, standards, tools, and best practices for green software".
With over 80 member organizations — including Intel, Goldman Sachs, NTT Data, Avanade,
UBS, and Globant — the GSF operates through several working groups.
Green Software Foundation Structure
Working Group
Objective
Main Outputs
Standards
Define specifications and metrics
SCI Specification (ISO/IEC 21031), SCI for AI
Open Source
Develop tools and SDKs
Carbon Aware SDK, Impact Framework
Community
Training and outreach
Green Software Practitioner Certification
Policy
Influence regulations
SOFT Framework (formerly TOSS), ESG guidelines
In 2025, the GSF reached a significant milestone with the ratification of the SOFT Framework
(Sustainable Organizational Framework for Technology), led by Pindy Bhullar with support from Microsoft.
Four global organizations are already piloting the framework, addressing challenges such as data gaps,
tool integration decisions, and securing organizational buy-in.
The 8 Principles of Green Software Engineering
The principles of green software are not optional suggestions: they are a design philosophy
that influences every decision, from architecture to infrastructure choices. Let us examine each in detail.
Principles Overview
#
Principle
Focus
Key Metric
1
Carbon Efficiency
Emit less carbon per unit of work
gCO₂eq per operation
2
Energy Efficiency
Use less energy for the same function
kWh per transaction
3
Carbon Awareness
Adapt to the grid's carbon intensity
gCO₂eq/kWh of the grid
4
Hardware Efficiency
Maximize hardware utilization and lifespan
Embodied carbon (gCO₂eq)
5
Measurement
Measure and track emissions
SCI Score
6
Climate Commitments
Corporate climate pledges
Net Zero, Carbon Neutral
7
Networking
Reduce network traffic
Bytes transferred per operation
8
Demand Shaping
Shape demand based on available energy
Load during low-carbon periods
Principle 1: Carbon Efficiency
Carbon Efficiency means emitting the smallest possible amount of carbon
per unit of work. It is the founding principle: everything else derives from it. A
carbon-efficient software application is not necessarily the fastest or cheapest, but it is
the one that produces the least environmental impact for the value it generates.
The analogy is vehicle fuel efficiency: we care not only about how far a car travels,
but how many grams of CO₂ it emits per kilometer. Similarly, green software measures
grams of CO₂ equivalent per API call, per user served, per completed transaction.
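As a back-of-the-envelope sketch of that metric, carbon per operation is a direct unit conversion from per-operation energy and grid intensity. The 0.2 Wh per-request figure below is a hypothetical placeholder, not a measured value:

```python
def grams_co2_per_operation(wh_per_operation: float,
                            grid_gco2_per_kwh: float) -> float:
    """Convert per-operation energy (Wh) into gCO2eq using the grid's intensity."""
    return (wh_per_operation / 1000) * grid_gco2_per_kwh

# Hypothetical API request costing 0.2 Wh on a ~350 gCO2eq/kWh grid:
# (0.2 / 1000) * 350 = 0.07 gCO2eq per request
```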
The Principle in Action
If two implementations of a feature produce the same result, but one consumes
50% less energy, the second is more carbon-efficient — even if
it takes a few extra milliseconds. In a world where data centers consume hundreds of TWh,
these savings add up enormously at scale.
Practical Strategies for Carbon Efficiency
1. Algorithmic Optimization
The choice of algorithm has a greater energy impact than the choice of programming language.
An O(n log n) algorithm is not just faster than an O(n²) one: it consumes significantly
less energy on large datasets.
Python — Energy impact comparison of search algorithms
# INEFFICIENT search: O(n) - linear scan on unsorted list
def find_user_linear(users: list, target_id: str) -> dict | None:
    """Linear scan: consumes energy proportional to N."""
    for user in users:
        if user["id"] == target_id:
            return user
    return None

# EFFICIENT search: O(1) - dictionary lookup (hash map)
def build_user_index(users: list) -> dict:
    """Index construction: one-time O(n) investment."""
    return {user["id"]: user for user in users}

def find_user_indexed(index: dict, target_id: str) -> dict | None:
    """O(1) lookup: constant energy consumption."""
    return index.get(target_id)

# Impact on 1 million users with 10,000 searches:
# - Linear: ~10 billion comparisons = ~15 Wh
# - Indexed: ~10,000 lookups = ~0.001 Wh
# Savings: 99.99% energy for the same functionality
2. Strategic Caching
Every request to a database or external API has an energy cost. Caching reduces
repeated computations, eliminating unnecessary work and the associated emissions.
TypeScript — Multi-level cache with TTL to reduce emissions
interface CacheEntry<T> {
readonly data: T;
readonly timestamp: number;
readonly ttlMs: number;
}
class GreenCache<T> {
private readonly cache = new Map<string, CacheEntry<T>>();
private hits = 0;
private misses = 0;
get(key: string): T | undefined {
const entry = this.cache.get(key);
if (!entry) {
this.misses++;
return undefined;
}
const isExpired = Date.now() - entry.timestamp > entry.ttlMs;
if (isExpired) {
this.cache.delete(key);
this.misses++;
return undefined;
}
this.hits++;
return entry.data;
}
set(key: string, data: T, ttlMs: number = 300_000): void {
// Immutable: we create a new entry, we do not mutate
const entry: CacheEntry<T> = {
data,
timestamp: Date.now(),
ttlMs,
};
this.cache.set(key, entry);
}
/** Efficiency metrics: a high hit rate = less energy consumed */
getEfficiencyReport(): object {
const total = this.hits + this.misses;
const hitRate = total > 0 ? (this.hits / total) * 100 : 0;
// Estimate: each cache hit saves ~0.5-2ms of CPU + network I/O
const estimatedSavedWh = this.hits * 0.000002; // ~2 µWh per hit
return {
hits: this.hits,
misses: this.misses,
hitRate: hitRate.toFixed(1) + '%',
estimatedSavedWh: estimatedSavedWh.toFixed(6),
};
}
}
// Usage:
// const cache = new GreenCache<UserProfile>();
// cache.set('user:123', profile, 600_000); // TTL 10 minutes
// const cached = cache.get('user:123'); // O(1), zero network
3. Operation Batching
Aggregating multiple operations into a single request reduces network overhead, serialization/deserialization,
and the CPU cycles dedicated to connection management.
TypeScript — API request batching for carbon efficiency
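The code for this example did not survive publishing, so the sketch below is a reconstruction of the idea under assumptions: the `User` shape, the `fetchBatch` callback, and the batch size of 50 are all illustrative. The point is that N individual requests collapse into ceil(N / batchSize) round trips, amortizing connection setup and serialization:

```typescript
type User = { id: string; name: string };

function chunk<T>(items: readonly T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

async function fetchUsersBatched(
  ids: readonly string[],
  fetchBatch: (batch: string[]) => Promise<User[]>, // one HTTP call per batch
  batchSize = 50
): Promise<User[]> {
  const results: User[] = [];
  for (const batch of chunk(ids, batchSize)) {
    // One round trip (TCP/TLS setup, headers, serialization) is paid
    // once per batch instead of once per id.
    results.push(...(await fetchBatch([...batch])));
  }
  return results;
}
```

With 5,000 ids and a batch size of 50, this issues 100 requests instead of 5,000, cutting network overhead by roughly 98% before compression is even considered.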
Principle 2: Energy Efficiency
Energy Efficiency means having software perform the same amount of work
with less energy. Energy is the intermediate factor between software and carbon emissions:
by reducing the energy consumed, emissions are proportionally reduced, regardless of the
energy source.
Energy efficiency in software is achieved at multiple levels: application (algorithms
and architecture), runtime (resource management and concurrency), and infrastructure
(server sizing and auto-scaling).
Energy Consumption by Programming Language
Language
Relative Energy
Relative Time
Relative Memory
C
1.00x (baseline)
1.00x
1.00x
Rust
1.03x
1.04x
1.54x
Java
1.98x
1.89x
6.01x
Go
3.23x
2.83x
1.05x
TypeScript
21.50x
46.20x
4.69x
Python
75.88x
71.90x
2.80x
Source: "Energy Efficiency across Programming Languages" — University of Minho (2017, updated 2021)
Caution: Language Is Not Everything
The data above measure synthetic benchmarks. In the real world, algorithm choice, architecture,
and I/O patterns often have a greater impact than language choice.
A well-designed Python application with intelligent caching can be greener
than a poorly architected C application that recalculates everything on every request.
Energy Efficiency Patterns in Code
Java — Lazy Loading and deferred computation to save energy
/**
* Green Pattern: Lazy Initialization
* Compute expensive resources ONLY when actually needed.
* Estimated savings: 30-70% CPU for requests that do not access
* all object properties.
*/
public final class LazyReport {
private final String reportId;
private final Supplier<ReportData> dataSupplier;
private volatile ReportData cachedData;
public LazyReport(String reportId, Supplier<ReportData> dataSupplier) {
this.reportId = reportId;
this.dataSupplier = dataSupplier;
// We do NOT compute data in the constructor
}
public ReportData getData() {
// Double-checked locking: compute only on first access
ReportData result = cachedData;
if (result == null) {
synchronized (this) {
result = cachedData;
if (result == null) {
cachedData = result = dataSupplier.get();
}
}
}
return result;
}
public String getReportId() {
return reportId; // Zero CPU: no expensive computation
}
}
// Energy comparison:
// - Eager: 100 reports created, all compute data = 100 DB queries
// - Lazy: 100 reports created, only 10 accessed = 10 DB queries
// Savings: 90% energy for unnecessary queries
Python — Generators for memory-efficient streaming processing
import csv
from typing import Generator, Iterator

def process_large_csv_green(filepath: str) -> Generator[dict, None, None]:
    """
    Green Pattern: Streaming with generator.
    Process files with millions of rows without loading everything into memory.
    - Memory: O(1) instead of O(n)
    - Energy: proportional only to rows actually processed
    """
    with open(filepath, 'r', encoding='utf-8') as f:
        reader = csv.DictReader(f)
        for row in reader:
            # Yield one row at a time: zero bulk allocation
            yield {
                "id": row["id"],
                "value": float(row["amount"]),
                "processed": True,
            }

def aggregate_green(rows: Iterator[dict], limit: int = 1000) -> dict:
    """
    Streaming aggregation: compute statistics without
    materializing the entire collection in memory.
    """
    total = 0.0
    count = 0
    max_val = float('-inf')
    for row in rows:
        total += row["value"]
        count += 1
        if row["value"] > max_val:
            max_val = row["value"]
        if count >= limit:
            break  # Early termination: don't waste energy
    return {
        "count": count,
        "total": round(total, 2),
        "average": round(total / count, 2) if count > 0 else 0,
        "max": max_val,
    }

# Comparison:
# Anti-pattern: data = list(csv.DictReader(f))  # Loads EVERYTHING into RAM
# - 2GB file = 2GB RAM + potential swap + GC overhead
# Green pattern: use generator + early termination
# - 2GB file = ~1KB constant RAM, stops as soon as it has enough data
Principle 3: Carbon Awareness
Carbon Awareness is the most innovative principle: doing the same
amount of work, but at the time or place where electricity is cleanest.
The carbon intensity of the electrical grid varies enormously based on the time of day, season,
weather conditions, and the energy mix of the region.
Grid Carbon Intensity by Country (2024-2025)
Country/Region
gCO₂eq/kWh (average)
Main Energy Mix
Classification
Sweden
~12
Hydroelectric + Nuclear
Very Low
France
~56
Nuclear (70%)
Low
Canada
~120
Hydroelectric + Gas
Medium
Germany
~350
Renewables + Coal
High
USA (average)
~390
Gas + Coal + Renewables
High
Poland
~650
Coal (dominant)
Very High
India
~700
Coal (predominant)
Very High
Carbon Awareness is implemented with two complementary strategies defined by the GSF:
Temporal Shifting
Moving workloads in time to run them when the grid's carbon intensity
is lower. Example: scheduling batch jobs at night when wind power produces more.
Studies show reductions of 45-55% in emissions without
impact on perceived performance.
Spatial Shifting
Moving workloads in space, running them in data centers located in regions
with cleaner energy. If your cloud provider has regions in Sweden (12 gCO₂/kWh)
and in Poland (650 gCO₂/kWh), choosing Sweden reduces emissions by 98%
for the same workload.
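The percentage figures above follow directly from the ratio of grid intensities; a one-line helper makes the arithmetic explicit (intensity values taken from the table above):

```python
def shifting_reduction_pct(from_gco2_kwh: float, to_gco2_kwh: float) -> float:
    """Emission reduction (%) from moving a fixed workload between two grids."""
    return (1 - to_gco2_kwh / from_gco2_kwh) * 100

# Spatial shifting Poland -> Sweden: (1 - 12/650) * 100 ≈ 98.2%
# The workload is identical; only the energy behind it changes.
```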
APIs and Tools for Carbon Awareness
Several services provide real-time data on grid carbon intensity,
allowing software to make informed decisions.
Main Carbon Intensity Data Providers
Provider
Coverage
Data Type
Access
Electricity Maps
Global (160+ zones)
Real-time + forecast
API (free tier available)
WattTime
USA, Europe, Australia
Marginal emissions
API (free registration)
Carbon Intensity UK
Great Britain
96h forecast + historical
Free public API
Ember
Global (200+ countries)
Annual/monthly average
Open data dataset
TypeScript — Carbon-aware scheduler with Electricity Maps API
interface CarbonIntensityData {
  carbonIntensity: number; // gCO2eq/kWh
  zone: string;
}

interface ScheduleDecision {
  shouldRun: boolean;
  currentIntensity: number;
  threshold: number;
  reason: string;
}

// Thresholds are illustrative defaults (gCO2eq/kWh)
const CARBON_THRESHOLD_LOW = 150;
const CARBON_THRESHOLD_HIGH = 400;

async function getCarbonIntensity(zone: string): Promise<CarbonIntensityData> {
  const response = await fetch(
    `https://api.electricitymap.org/v3/carbon-intensity/latest?zone=${zone}`,
    {
      headers: {
        'auth-token': process.env['ELECTRICITY_MAPS_TOKEN'] ?? '',
      },
    }
  );
  if (!response.ok) {
    throw new Error(`Carbon API error: ${response.status}`);
  }
  return response.json();
}

async function shouldRunWorkload(
  zone: string,
  priority: 'low' | 'medium' | 'high'
): Promise<ScheduleDecision> {
  const data = await getCarbonIntensity(zone);
  const intensity = data.carbonIntensity;
  // High priority: always run (but log emissions)
  if (priority === 'high') {
    return {
      shouldRun: true,
      currentIntensity: intensity,
      threshold: Infinity,
      reason: 'High priority: executing regardless of carbon intensity',
    };
  }
  // Medium priority: run below high threshold
  if (priority === 'medium') {
    return {
      shouldRun: intensity < CARBON_THRESHOLD_HIGH,
      currentIntensity: intensity,
      threshold: CARBON_THRESHOLD_HIGH,
      reason: intensity < CARBON_THRESHOLD_HIGH
        ? 'Medium priority: intensity acceptable'
        : 'Medium priority: deferring to lower intensity period',
    };
  }
  // Low priority: run only with clean energy
  return {
    shouldRun: intensity < CARBON_THRESHOLD_LOW,
    currentIntensity: intensity,
    threshold: CARBON_THRESHOLD_LOW,
    reason: intensity < CARBON_THRESHOLD_LOW
      ? 'Low priority: clean energy available'
      : 'Low priority: waiting for cleaner energy window',
  };
}

// Usage example:
// const decision = await shouldRunWorkload('IT-SO', 'low');
// if (decision.shouldRun) {
//   await runBatchJob();
// } else {
//   scheduleRetry(decision);
// }
GSF's Carbon Aware SDK
The Green Software Foundation develops the Carbon Aware SDK, an open-source tool
that exposes a REST API and a CLI for querying carbon intensity data from multiple providers
(WattTime, Electricity Maps). The SDK allows implementing both temporal shifting and spatial
shifting with just a few lines of code, directly integrating carbon awareness into CI/CD pipelines
and orchestration systems.
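A minimal spatial-shifting sketch against such an API might look like the following. The base URL, the `/emissions/bylocations` path, the `location` query parameter, and the `rating` response field are assumptions drawn from the SDK's WebAPI documentation — verify them against your own deployment before relying on this:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE_URL = "http://localhost:5073"  # assumption: locally hosted Carbon Aware WebAPI

def pick_greenest(emissions: list[dict]) -> dict:
    """Choose the entry with the lowest carbon rating (gCO2eq/kWh)."""
    return min(emissions, key=lambda e: e["rating"])

def greenest_region(locations: list[str]) -> dict:
    """Query current emissions for several regions and pick the cleanest one."""
    query = urlencode([("location", loc) for loc in locations])
    with urlopen(f"{BASE_URL}/emissions/bylocations?{query}") as resp:
        return pick_greenest(json.load(resp))

# pick_greenest([{"location": "polandcentral", "rating": 650},
#                {"location": "swedencentral", "rating": 12}])
# -> the Sweden entry
```

The selection logic itself is deliberately trivial: the value of the SDK lies in normalizing data from providers like WattTime and Electricity Maps behind one interface.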
Principle 4: Hardware Efficiency
Hardware Efficiency is the most underestimated principle. Manufacturing
an electronic device generates enormous amounts of CO₂: this is the
embodied carbon. For a laptop, approximately 70-80%
of the total emissions over its lifecycle come from manufacturing, not from use.
Embodied Carbon of Common Devices
Device
Embodied Carbon (kgCO₂eq)
Average Lifespan
Annual Amortization
Smartphone
~70
3 years
~23 kgCO₂eq/year
Laptop
~300-400
4 years
~75-100 kgCO₂eq/year
Server (rack)
~1,000-2,000
5-6 years
~200-400 kgCO₂eq/year
GPU Server (AI)
~5,000-8,000
3-5 years
~1,000-2,600 kgCO₂eq/year
As developers, we can influence hardware efficiency in two ways:
Extending Hardware Lifespan
Write software compatible with older hardware
Avoid dependencies on unnecessary hardware-specific features
Support graceful degradation on older devices
Optimize to reduce minimum system requirements
Maximizing Utilization
Prefer serverless architectures or optimized containers
Implement auto-scaling to avoid idle servers
Use spot/preemptible instances for non-critical workloads
Consolidate services to increase average server utilization
The Impact of Right-Sizing
A server with average utilization of 15% wastes 85% of the embodied carbon invested.
Raising utilization to 60% cuts the embodied cost per unit of work by a factor of four.
Cloud computing and serverless help enormously here, sharing hardware among
thousands of users and amortizing embodied carbon across more workloads.
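That factor of four can be checked with the embodied-carbon figure used elsewhere in this article (822 gCO₂eq/day for a rack server), treating embodied carbon as amortized over the hours the server actually does useful work:

```python
def embodied_per_useful_hour(embodied_g_per_day: float, utilization: float) -> float:
    """Embodied carbon (gCO2eq) charged to each hour of actual work."""
    useful_hours_per_day = 24 * utilization
    return embodied_g_per_day / useful_hours_per_day

low = embodied_per_useful_hour(822, 0.15)   # ~228 gCO2eq per useful hour
high = embodied_per_useful_hour(822, 0.60)  # ~57 gCO2eq per useful hour
# low / high == 4.0: quadrupling utilization quarters the embodied cost
# per unit of work, with zero code changes.
```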
Principle 5: Measurement — The SCI Specification
"If you can't measure it, you can't improve it." The Measurement principle is the bridge
between theory and practice. The Software Carbon Intensity (SCI) is the specification
developed by the GSF to quantify the carbon emissions of a software application. In 2024
it was ratified as the international standard ISO/IEC 21031:2024.
The SCI Formula
The SCI formula calculates a rate, not a total. This is fundamental: the goal
is to reduce carbon intensity per unit of work, not simply the total
emissions (which could decrease simply by reducing software usage).
SCI = ((E × I) + M) per R

where E is the energy consumed by the software (kWh), I is the carbon intensity of the
electricity used (gCO₂eq/kWh), M is the embodied carbon of the hardware (gCO₂eq,
amortized), and R is the functional unit: per API call, per user, per transaction, per token.
Practical Example: SCI Calculation for a REST API
Let us calculate the SCI score for an API microservice handling 100,000 requests per day,
hosted on a cloud server in Germany.
SCI Calculation — REST API example in Germany
# === Input Data ===
# E (Energy): the server consumes an average of 150W
# Runs 24h/day = 150W * 24h = 3,600 Wh = 3.6 kWh/day
E = 3.6 # kWh/day
# I (Carbon intensity): average German grid
I = 350 # gCO2eq/kWh (Germany average 2024-2025)
# M (Embodied carbon): rack server with 5-year lifespan
# Total embodied: 1,500 kgCO2eq
# Daily amortization: 1,500,000 / (5 * 365) = 821.9 gCO2eq/day
M = 822 # gCO2eq/day
# R (Functional unit): 100,000 API requests per day
R = 100_000
# === SCI Calculation ===
# Operational emissions: E * I = 3.6 * 350 = 1,260 gCO2eq/day
operational = E * I # 1,260 gCO2eq
# Daily total: (E * I) + M = 1,260 + 822 = 2,082 gCO2eq/day
total_daily = operational + M # 2,082 gCO2eq
# SCI = total / R = 2,082 / 100,000 = 0.02082 gCO2eq per request
SCI = total_daily / R # 0.02082 gCO2eq/request
# === Comparison: same API hosted in Sweden ===
I_sweden = 12 # gCO2eq/kWh
operational_sweden = E * I_sweden # 3.6 * 12 = 43.2 gCO2eq
total_sweden = operational_sweden + M # 43.2 + 822 = 865.2 gCO2eq
SCI_sweden = total_sweden / R # 0.00865 gCO2eq/request
# Reduction: from 0.02082 to 0.00865 = -58.4% emissions
# Just by changing the hosting region!
Choosing the Functional Unit (R)
The functional unit R is crucial and must reflect the real value of the software.
A poor choice of R can make the SCI score misleading. Here are the recommended choices by application type:
APIs/Microservices: per API request or per completed transaction
Web Applications: per active user or per session
AI/ML Models: per inference, per generated token, or per training run
Batch Processing: per processed record or per GB processed
Streaming: per minute of streaming or per GB transferred
SCI for AI: The New Standard
In 2025, the GSF ratified the SCI for AI specification, an extension of the SCI
that covers the entire lifecycle of AI systems: from training to inference, including fine-tuning
and RAG. The specification supports different types of systems: classical machine learning, computer vision,
NLP, generative AI, and agentic systems, with dedicated functional units such as
tokens for language models and inference for classifiers.
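Applying the SCI formula with tokens as the functional unit looks like this. Every number in the example is a hypothetical placeholder for illustration, not a benchmark of any real model:

```python
def sci_per_token(energy_kwh: float, grid_gco2_kwh: float,
                  embodied_g: float, tokens_served: int) -> float:
    """SCI = ((E * I) + M) / R, with R = generated tokens."""
    return ((energy_kwh * grid_gco2_kwh) + embodied_g) / tokens_served

# Hypothetical inference server over one day:
# E = 24 kWh, I = 390 gCO2eq/kWh, M = 3,425 gCO2eq/day (amortized GPU server),
# R = 5,000,000 tokens
# ((24 * 390) + 3425) / 5_000_000 = 0.002557 gCO2eq per token
```

As with the REST API example, the score is a rate: serving more tokens on the same hardware lowers it, and so does moving to a cleaner grid.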
Python — Automated SCI calculation with CodeCarbon
from codecarbon import EmissionsTracker
from dataclasses import dataclass

@dataclass(frozen=True)
class SCIResult:
    """Immutable result of the SCI calculation."""
    operational_emissions_g: float  # E * I in gCO2eq
    embodied_emissions_g: float     # M in gCO2eq
    functional_units: int           # R
    sci_score: float                # SCI in gCO2eq/R
    region: str

    @property
    def total_emissions_g(self) -> float:
        return self.operational_emissions_g + self.embodied_emissions_g

def measure_api_sci(
    handler_fn,
    requests_count: int,
    server_embodied_gco2: float = 822.0,  # daily
    country_iso_code: str = "DEU",
) -> SCIResult:
    """
    Measures the SCI score of an API function using CodeCarbon.
    CodeCarbon automatically tracks E and computes E*I based on
    the local grid.
    """
    tracker = EmissionsTracker(
        project_name="api-sci-measurement",
        country_iso_code=country_iso_code,
        log_level="warning",
    )
    tracker.start()
    # Run the workload
    for _ in range(requests_count):
        handler_fn()
    # Get operational emissions (E * I) in kg, convert to grams
    emissions_kg = tracker.stop()
    operational_g = emissions_kg * 1000
    # Calculate SCI
    total = operational_g + server_embodied_gco2
    sci = total / requests_count if requests_count > 0 else 0
    return SCIResult(
        operational_emissions_g=round(operational_g, 4),
        embodied_emissions_g=server_embodied_gco2,
        functional_units=requests_count,
        sci_score=round(sci, 6),
        region=country_iso_code,
    )

# Usage example:
# result = measure_api_sci(my_handler, requests_count=10000)
# print(f"SCI Score: {result.sci_score} gCO2eq/request")
# print(f"Total: {result.total_emissions_g} gCO2eq")
Principle 6: Climate Commitments
Understanding the terminology of climate commitments is essential to avoid
greenwashing. The GSF clearly distinguishes three approaches, with very different
levels of effectiveness.
Climate Commitments Comparison
Approach
Definition
Effectiveness
Risks
Carbon Neutral
Offset emissions with carbon credits
Medium
Variable credit quality, does not reduce actual emissions
Net Zero
Reduce emissions by 90%+ and offset the remainder
High
Timelines often distant (2040-2050)
Carbon Abatement
Eliminate emissions at the source
Maximum
High upfront cost, requires deep transformation
Beware of Greenwashing
The SCI specification does not allow offsets as a mechanism to
reduce the SCI score. Only real emission reductions — through less energy, cleaner
energy, or more efficient hardware — lower the SCI. This is a
deliberate choice by the GSF to incentivize abatement (elimination) over
neutralization (offsetting).
For us as developers, this principle translates to: reduce first, then offset.
Every code optimization, every more efficient architectural choice, every migration to a greener
cloud region is real abatement. Carbon credits are the last resort.
Principle 7: Networking Efficiency
Every byte transferred over the network has an energy cost. Data transmission over the Internet consumes energy
for routers, switches, cellular antennas, and submarine cables. Reducing network traffic
is a direct lever for reducing emissions.
Strategies to Reduce Network Traffic
Compression: enable gzip/brotli for HTTP responses (60-90% reduction)
Image optimization: use modern formats (WebP, AVIF) and responsive images
Lazy loading: load resources only when needed
Efficient APIs: GraphQL to request only the necessary fields, pagination, server-side filters
CDN: serve content from the edge location closest to the user
Cache headers: ETag, Cache-Control to avoid redundant transfers
Code splitting: load only the JavaScript needed for the current page
HTTP Headers — Green configuration to reduce traffic
# Nginx: optimized configuration for green networking
# Brotli compression (superior to gzip: -20% size)
brotli on;
brotli_comp_level 6;
brotli_types text/html text/css application/javascript application/json;
# Aggressive caching for static assets (1 year)
location /assets/ {
expires 1y;
add_header Cache-Control "public, immutable";
add_header Vary "Accept-Encoding";
}
# Cache for API responses with ETag
location /api/ {
add_header Cache-Control "public, max-age=300"; # 5 minutes
etag on;
}
# Estimated savings on 1M pages/day:
# - Without compression: ~2TB/day transferred
# - With Brotli + cache: ~200GB/day transferred
# - Reduction: ~90% traffic = ~90% network energy
Principle 8: Demand Shaping
Demand Shaping is the most ambitious principle: instead of adapting
software to available energy (as in Carbon Awareness), we shape the demand itself
based on energy conditions. This can mean offering a reduced version of the service during
periods of high carbon intensity, or gracefully degrading more expensive features.
TypeScript — Demand Shaping: adaptive quality based on carbon intensity
interface VideoQualityConfig {
readonly resolution: '4K' | '1080p' | '720p' | '480p';
readonly bitrate: number; // kbps
readonly estimatedKwhPerHour: number;
}
const QUALITY_PROFILES: Record<string, VideoQualityConfig> = {
ultra: { resolution: '4K', bitrate: 25000, estimatedKwhPerHour: 0.042 },
high: { resolution: '1080p', bitrate: 8000, estimatedKwhPerHour: 0.018 },
medium: { resolution: '720p', bitrate: 5000, estimatedKwhPerHour: 0.012 },
eco: { resolution: '480p', bitrate: 2500, estimatedKwhPerHour: 0.007 },
};
function selectQualityByCarbon(
carbonIntensity: number, // gCO2eq/kWh
userPreference: string = 'auto'
): VideoQualityConfig {
// If the user has chosen manually, respect the choice
if (userPreference !== 'auto') {
return QUALITY_PROFILES[userPreference] ?? QUALITY_PROFILES['high'];
}
// Demand shaping: adapt quality to available energy
if (carbonIntensity < 50) {
return QUALITY_PROFILES['ultra']; // Clean energy: maximum quality
}
if (carbonIntensity < 200) {
return QUALITY_PROFILES['high']; // Moderate: high quality
}
if (carbonIntensity < 400) {
return QUALITY_PROFILES['medium']; // High: reduced quality
}
return QUALITY_PROFILES['eco']; // Critical: eco mode
}
// Transparency: show the user the reason for the choice
function getEcoMessage(config: VideoQualityConfig, intensity: number): string {
  const saved = QUALITY_PROFILES['ultra'].estimatedKwhPerHour
    - config.estimatedKwhPerHour;
  const savedPercent = (saved / QUALITY_PROFILES['ultra'].estimatedKwhPerHour * 100)
    .toFixed(0);
  return `Streaming in ${config.resolution} (grid: ${intensity} gCO2/kWh). `
    + `Energy savings: ${savedPercent}% compared to 4K.`;
}
Demand Shaping in the Real World
Companies like Google and Microsoft already implement forms of demand
shaping in their data centers, shifting indexing and ML training workloads to times and regions
with cleaner energy. Even consumer services like the eco mode of
some video streaming services are examples of demand shaping applied on the user side.
Tools and Frameworks for Measurement
The green software tooling ecosystem is growing rapidly. Here are the most
mature and widely adopted tools for measuring and reducing the carbon impact of software.
Measurement Tools Comparison
| Tool | Type | Languages/Platforms | Use Case |
| --- | --- | --- | --- |
| CodeCarbon | Python library | Python (PyTorch, TF, HF) | Emissions tracking for ML training and scripts |
| Cloud Carbon Footprint | Dashboard + CLI | AWS, GCP, Azure | Cloud infrastructure emissions monitoring |
| Carbon Aware SDK | SDK + REST API | Polyglot (.NET, REST) | Carbon-aware scheduling and region optimization |
| Scaphandre | Monitoring agent | Linux (bare metal + VM) | Process-level power monitoring |
| Green Metrics Tool | CI/CD platform | Any (container-based) | Energy benchmarking in CI pipelines |
| Climatiq API | REST API | Any | Emission factors for LCA calculations |
| Cumulator | Python toolkit | Python | Carbon footprint for computation + ML data transfer |
CodeCarbon: Emissions Tracking in Python
CodeCarbon is the most popular library for tracking CO₂ emissions
in Python code. It monitors CPU, GPU, and RAM during execution and combines the data with regional
carbon intensity to estimate emissions. It integrates natively with PyTorch, TensorFlow, and
Hugging Face.
Python — CodeCarbon: decorator for automatic tracking
from codecarbon import track_emissions, EmissionsTracker
import pandas as pd

# Method 1: Decorator (simpler)
@track_emissions(
    project_name="data-pipeline",
    output_dir="./emissions",
    country_iso_code="ITA",
)
def run_data_pipeline(input_path: str) -> pd.DataFrame:
    """Data processing pipeline with automatic tracking."""
    df = pd.read_csv(input_path)
    # ... transformations ...
    result = df.groupby("category").agg({
        "revenue": ["sum", "mean"],
        "transactions": "count",
    })
    return result

# Method 2: Context manager (more control)
# (preprocess, train, and evaluate are application-specific helpers)
def train_model_green(X_train, y_train, X_test, y_test):
    """Training with granular emissions measurement."""
    tracker = EmissionsTracker(
        project_name="model-training",
        measure_power_secs=15,    # Sample every 15 seconds
        tracking_mode="process",  # Current process only
        country_iso_code="ITA",
        save_to_api=False,        # Save locally only
    )
    tracker.start()

    # Phase 1: Preprocessing
    tracker.start_task("preprocessing")
    X_scaled = preprocess(X_train)
    emissions_preprocess = tracker.stop_task("preprocessing")

    # Phase 2: Training
    tracker.start_task("training")
    model = train(X_scaled, y_train)
    emissions_training = tracker.stop_task("training")

    # Phase 3: Evaluation
    tracker.start_task("evaluation")
    metrics = evaluate(model, X_test, y_test)
    emissions_eval = tracker.stop_task("evaluation")

    total_emissions = tracker.stop()
    return {
        "model": model,
        "metrics": metrics,
        "emissions": {
            "preprocessing_kgCO2": emissions_preprocess,
            "training_kgCO2": emissions_training,
            "evaluation_kgCO2": emissions_eval,
            "total_kgCO2": total_emissions,
        },
    }
Cloud Carbon Footprint (CCF) is an open-source tool that estimates CO₂ emissions
from cloud infrastructure on AWS, Google Cloud, and Azure. It analyzes billing and
usage data to calculate the energy consumed and related emissions, offering an interactive dashboard
and optimization recommendations.
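CCF's estimation approach can be approximated as: energy = usage hours × average instance power × datacenter PUE, then emissions = energy × regional grid intensity. A minimal sketch of that chain follows; the wattage, PUE, and intensity figures are illustrative placeholders, not CCF's actual coefficients.

```python
def estimate_cloud_emissions_kg(
    vcpu_hours: float,
    avg_watts_per_vcpu: float = 3.5,   # illustrative average draw per vCPU
    pue: float = 1.2,                  # datacenter Power Usage Effectiveness
    grid_gco2_per_kwh: float = 350.0,  # regional grid carbon intensity
) -> float:
    """Rough CCF-style estimate: billing usage -> energy -> emissions."""
    energy_kwh = vcpu_hours * avg_watts_per_vcpu / 1000 * pue
    return energy_kwh * grid_gco2_per_kwh / 1000  # gCO2 -> kgCO2

# e.g. 10,000 vCPU-hours in a month under these illustrative coefficients
monthly_kg = estimate_cloud_emissions_kg(10_000)
```

The value of the real tool lies in its curated per-instance-type coefficients and per-region grid factors; the arithmetic itself is this straightforward.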
According to the State of Green Software 2025 report by the GSF, the adoption of green
software principles is growing rapidly. The green data center market is valued at
approximately $95 billion in 2025 and is projected to reach $396 billion
by 2034. The software component is the fastest-growing category,
with a CAGR of 30.4%.
Big Tech Green Software Initiatives
| Company | Objective | Key Initiatives |
| --- | --- | --- |
| Microsoft | Carbon negative by 2030 | Emissions Impact Dashboard, carbon-aware Windows Update |
| | | Open sourcing sustainable AI research, data center efficiency |
The Role of Regulation
The European Union is leading regulation with the Corporate Sustainability Reporting
Directive (CSRD), which requires large companies to report Scope 1, 2, and 3 emissions.
For software companies, Scope 3 (indirect emissions from the supply chain, including software use
by customers) often represents 80-90% of the total. This makes SCI measurement no longer
optional, but a compliance requirement.
Practical Roadmap: From Zero to Green Software
Implementing green software principles does not require a revolution. Here is a progressive roadmap
for integrating carbon efficiency into the development cycle.
Phase 1: Measure (Weeks 1-2)
Integrate CodeCarbon or Cloud Carbon Footprint into your project
Define the functional unit (R) for your software
Calculate the baseline SCI score
Document the current energy consumption of key services
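For the baseline SCI score, the formula defined by ISO/IEC 21031 is SCI = ((E × I) + M) per R, where E is the energy consumed (kWh), I the regional carbon intensity (gCO2eq/kWh), M the amortized embodied emissions, and R the functional unit. A baseline calculation might look like this (all figures illustrative):

```python
def sci_score(
    energy_kwh: float,       # E: energy consumed by the software
    intensity: float,        # I: grid carbon intensity (gCO2eq/kWh)
    embodied_g: float,       # M: amortized embodied emissions (gCO2eq)
    functional_units: int,   # R: e.g. number of API calls served
) -> float:
    """SCI = ((E * I) + M) / R, in gCO2eq per functional unit."""
    return (energy_kwh * intensity + embodied_g) / functional_units

# Illustrative baseline: 12 kWh/day, 350 gCO2/kWh grid,
# 2,000 g/day embodied share, 1M API calls/day
baseline = sci_score(12.0, 350.0, 2_000.0, 1_000_000)
# ~0.0062 gCO2eq per API call
```

Recording this number per release is what turns the SCI from a one-off audit into a regression metric.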
Phase 2: Optimize (Weeks 3-6)
Implement strategic caching to reduce repeated computations
Enable compression and lazy loading in the frontend
Verify server sizing (right-sizing)
Phase 3: Carbon-Aware (Weeks 7-10)
Integrate a carbon intensity API (Electricity Maps, WattTime)
Implement temporal shifting for non-critical batch jobs
Evaluate spatial shifting: is the current cloud region the greenest?
Add carbon-aware metrics to the monitoring dashboard
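A minimal carbon-aware gate for the non-critical batch jobs in Phase 3 might look like the sketch below. The HTTP endpoint and response field mimic the general style of providers like Electricity Maps but are placeholders; check the provider's actual API documentation before wiring this up.

```python
import json
from urllib.request import Request, urlopen

THRESHOLD_GCO2_KWH = 250.0  # illustrative cutoff for "clean enough"

def should_run_now(intensity: float, threshold: float = THRESHOLD_GCO2_KWH) -> bool:
    """Temporal-shifting policy: run immediately only when the grid is clean."""
    return intensity < threshold

def fetch_intensity(zone: str, token: str) -> float:
    """Query a carbon-intensity API (hypothetical URL and response shape)."""
    req = Request(
        f"https://api.example-carbon.org/v1/latest?zone={zone}",  # placeholder
        headers={"auth-token": token},
    )
    with urlopen(req) as resp:
        return float(json.load(resp)["carbonIntensity"])

# Usage sketch: defer the nightly re-index when the grid is dirty
# intensity = fetch_intensity("IT", token="...")
# if should_run_now(intensity):
#     run_reindex()
# else:
#     requeue(delay_hours=2)
```

Keeping the threshold in configuration lets operations tune how aggressively jobs are deferred without a redeploy.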
Phase 4: Culture (Ongoing)
Integrate the SCI score into CI/CD as a quality metric
Train the team on the GSF's Green Software Practitioner Certification
Include carbon impact in code reviews
Share results: transparency on emissions and progress
Quick Wins: The 5 Highest-Impact Actions
1. Enable Brotli compression on all HTTP responses (→ -70% network traffic)
2. Implement caching with appropriate TTLs (→ -50-90% DB queries)
3. Choose the greenest available cloud region (→ up to -98% operational emissions)
4. Right-size servers and enable auto-scaling (→ -30-60% idle energy)
5. Use lazy loading and code splitting in the frontend (→ -40-60% data transfer)
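Quick win #2 (caching with appropriate TTLs) can be as small as a dictionary with expiry timestamps. A framework-agnostic sketch, with the TTL and the query payload invented for illustration:

```python
import time

class TTLCache:
    """Tiny TTL cache: avoids re-running a query while its result is fresh."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store: dict = {}

    def get_or_compute(self, key, compute):
        now = time.monotonic()
        hit = self._store.get(key)
        if hit is not None and now - hit[1] < self.ttl:
            return hit[0]  # fresh: skip the expensive query entirely
        value = compute()
        self._store[key] = (value, now)
        return value

calls = 0
def expensive_query():
    """Stand-in for a costly DB query."""
    global calls
    calls += 1
    return {"rows": 42}

cache = TTLCache(ttl_seconds=60)
first = cache.get_or_compute("report", expensive_query)
second = cache.get_or_compute("report", expensive_query)  # served from cache
# expensive_query ran only once
```

Every cache hit is a query, and the energy behind it, that never happens; the TTL just bounds how stale the saved answer is allowed to be.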
Conclusion
The 8 principles of the Green Software Foundation are not a theoretical framework distant from
the daily reality of development. They are a practical lens through which to evaluate every
technical decision: from algorithm choice to deployment region, from caching strategy to
scaling policy.
The SCI specification (ISO/IEC 21031:2024) provides the common language for measuring
progress, and the tooling ecosystem — from CodeCarbon to the Carbon Aware SDK — makes
it possible to move from theory to practice with minimal investment.
With data centers on track to consume over 1,000 TWh annually and European
regulation (CSRD) requiring emissions transparency, green software is no longer a
"nice to have": it is a competitive and regulatory requirement. The best time to
start was yesterday. The second best time is now.
"The greenest software is not the one that offsets its emissions, but the one that
produces the fewest per unit of value. True sustainability is measured with the SCI,
not with carbon credits."
— Founding principle of the Green Software Foundation
Resources to Learn More
Learn Green Software — learn.greensoftware.foundation (free course)
Green Software Practitioner — Official GSF certification
In the next article in the series, we will explore sustainable design patterns
with specific architectures for microservices, frontend, and data pipelines, with measurable savings
metrics for each pattern.