Hands-on Activities: Host Static Website & Automate Backup Using Lifecycle Rules

Introduction

In modern cloud computing practices, hands-on activities play a crucial role in understanding how real-world architectures are designed, deployed, secured, and optimized. Two highly practical and frequently used tasks in Amazon Web Services (AWS) include hosting a static website using Amazon S3 and automating backup management using S3 Lifecycle Rules. These tasks are essential for students, developers, cloud engineers, and architects aiming to build scalable, cost-efficient, and high-performing cloud applications.

This guide explains each concept step-by-step, from the fundamentals through implementation, best practices, and AWS exam-relevant knowledge.

Core AWS Services Used

Amazon Simple Storage Service (S3) is a highly scalable, durable, and secure object storage service designed to store and retrieve any amount of data from anywhere on the internet. S3 is widely used for hosting static websites, data lakes, content archival, logging, image hosting, disaster recovery, versioning, and automated data lifecycle management.

Static Website Hosting

S3 is an excellent choice for static website hosting because it offers:

  • Near-infinite scalability
  • High durability (99.999999999%)
  • Low latency
  • Cost-effective storage
  • Simple deployment model
  • Integration with CloudFront, Route 53, logging, and monitoring

Lifecycle Rules

Lifecycle rules in Amazon S3 help automate object transitions between storage classes and define when objects should be archived or permanently deleted. These rules enable:

  • Cost optimization
  • Automated backups
  • Data archival
  • Retention policy enforcement
  • Compliance-oriented data management

Hands-on Activity 1: Hosting a Static Website on Amazon S3

This section provides a complete step-by-step walkthrough for hosting a static website using S3, with detailed instructions and production-level best practices.

Step 1: Create an S3 Bucket

To host a static website, begin by creating an S3 bucket. The bucket name must be globally unique, because all bucket names share a single namespace across every AWS account, even though each bucket is created in a specific Region.

Steps to create a bucket:

  • Open AWS Management Console
  • Navigate to S3
  • Click Create bucket
  • Enter bucket name (example: my-static-website-demo)
  • Select AWS Region
  • Disable block public access settings for website hosting

Example bucket structure:


my-static-website-demo
 β”œβ”€β”€ index.html
 └── error.html
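Because bucket names share a global namespace, the create step often fails on the first attempt. A quick local sanity check of the naming rules can save round-trips. The sketch below covers only the most common rules (3–63 characters; lowercase letters, digits, hyphens, and periods; must start and end with a letter or digit) — it is an illustrative helper, not an AWS API, and it cannot check global uniqueness:

```python
import re

# Common S3 bucket-naming rules: 3-63 chars, lowercase letters, digits,
# hyphens, and periods; must start and end with a letter or digit.
# This is a partial check, not the full AWS specification.
BUCKET_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name):
    """Return True if the name passes the common S3 naming rules."""
    return bool(BUCKET_NAME_RE.match(name))

print(is_valid_bucket_name("my-static-website-demo"))  # True
print(is_valid_bucket_name("My_Bucket"))               # False (uppercase, underscore)
```

Only S3 itself can confirm that a valid name is actually available.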

Step 2: Upload Website Files

You can upload HTML, CSS, JavaScript, images, and other static assets. AWS S3 does not support server-side execution such as PHP or Node.js; only static files are allowed.

Example index.html:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Static Website Hosted on Amazon S3</title>
</head>
<body>
  <h1>Static Website Hosted on Amazon S3</h1>
  <p>This is a demo webpage hosted entirely using S3 static hosting.</p>
</body>
</html>

Step 3: Enable Static Website Hosting

  • Select the bucket
  • Go to the Properties tab
  • Scroll to the bottom and choose Static website hosting
  • Choose Enable
  • Specify index.html and error.html

This will generate a website endpoint such as:

http://my-static-website-demo.s3-website-ap-south-1.amazonaws.com

Step 4: Update Bucket Policy for Public Access

Since static website hosting requires public read access, you must attach a bucket policy that allows anonymous GET requests.

Example S3 bucket policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-static-website-demo/*"
    }
  ]
}
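The same policy can also be generated and attached programmatically. The sketch below builds the policy document locally; the `put_bucket_policy` call that would apply it is left as a comment, since it requires boto3 and valid AWS credentials (the bucket name is the example one from above):

```python
import json

def make_website_policy(bucket_name):
    """Build a public-read bucket policy document for static website hosting."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }

policy = make_website_policy("my-static-website-demo")
print(json.dumps(policy, indent=2))

# To apply it (requires boto3 and AWS credentials):
# import boto3
# boto3.client("s3").put_bucket_policy(
#     Bucket="my-static-website-demo", Policy=json.dumps(policy))
```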

Step 5: Test Your Static Website

Open the S3 website endpoint in the browser. You should see your webpage load instantly.

Bonus: Host Website with Custom Domain

You can integrate:

  • Amazon Route 53 (Domain Management)
  • AWS Certificate Manager (SSL/TLS)
  • Amazon CloudFront (CDN)

This enhances speed, resilience, and global performance.

Hands-on Activity 2: Automating Backup Using S3 Lifecycle Rules

Lifecycle rules help automate data transition and deletion. This hands-on activity demonstrates how to set up lifecycle rules for automated backup and archival. Typical use cases include:

  • Move logs to Glacier after 30 days
  • Delete outdated backup files after 365 days
  • Transition infrequently used files to Standard-IA
  • Auto-clean temporary storage buckets

Step 1: Enable Versioning on the Bucket

Versioning ensures multiple versions of the same object are preserved. This is crucial for backup automation.

  • Go to the bucket
  • Open Properties
  • Enable Versioning
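Versioning can also be enabled with a single API call. The sketch below builds the request payload; the boto3 call itself is commented out because it needs AWS credentials, and the bucket name is the example one used throughout:

```python
# Payload for S3's PutBucketVersioning API.
versioning_config = {"Status": "Enabled"}

# To apply it (requires boto3 and AWS credentials):
# import boto3
# boto3.client("s3").put_bucket_versioning(
#     Bucket="my-static-website-demo",
#     VersioningConfiguration=versioning_config)

print(versioning_config["Status"])  # Enabled
```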

Step 2: Create a Lifecycle Rule

  • Open the bucket
  • Go to Management tab
  • Select Create lifecycle rule

Assign a rule name such as automated-backup-rule.

Step 3: Configure Lifecycle Transitions

Example production-grade lifecycle rule workflow:

  • Day 0–30 β†’ S3 Standard
  • Day 31–90 β†’ S3 Standard-Infrequent Access
  • Day 91–365 β†’ S3 Glacier Flexible Retrieval
  • After 365 days β†’ Delete or archive permanently
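The schedule above can be expressed as a small lookup that shows where an object of a given age lives. This is purely illustrative — S3 evaluates lifecycle rules itself, typically on a daily cycle — but it is a handy way to reason about the thresholds:

```python
def storage_class_for_age(age_days):
    """Return the storage class an object of the given age occupies
    under the example schedule above (illustrative only; S3 applies
    the actual transitions in the background)."""
    if age_days < 30:
        return "STANDARD"
    if age_days < 90:
        return "STANDARD_IA"
    if age_days < 365:
        return "GLACIER"
    return "EXPIRED"  # object is deleted by the expiration action

print(storage_class_for_age(10))   # STANDARD
print(storage_class_for_age(45))   # STANDARD_IA
print(storage_class_for_age(200))  # GLACIER
print(storage_class_for_age(400))  # EXPIRED
```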

Example JSON configuration:


{
  "Rules": [
    {
      "ID": "backup-automation",
      "Status": "Enabled",
      "Filter": {},
      "Transitions": [
        {
          "Days": 30,
          "StorageClass": "STANDARD_IA"
        },
        {
          "Days": 90,
          "StorageClass": "GLACIER"
        }
      ],
      "Expiration": {
        "Days": 365
      }
    }
  ]
}
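A configuration like the one above can be assembled and sanity-checked in code before applying it. In this sketch the day thresholds default to the example schedule, the `put_bucket_lifecycle_configuration` call is commented out so the snippet runs without AWS credentials, and the bucket name is hypothetical:

```python
import json

def make_lifecycle_config(ia_days=30, glacier_days=90, expire_days=365):
    """Build the example lifecycle configuration shown above."""
    # Transitions must occur in increasing order of days, and the
    # expiration must come after the last transition.
    assert ia_days < glacier_days < expire_days, "thresholds must increase"
    return {
        "Rules": [
            {
                "ID": "backup-automation",
                "Status": "Enabled",
                "Filter": {},
                "Transitions": [
                    {"Days": ia_days, "StorageClass": "STANDARD_IA"},
                    {"Days": glacier_days, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": expire_days},
            }
        ]
    }

config = make_lifecycle_config()
print(json.dumps(config, indent=2))

# To apply it (requires boto3 and AWS credentials; bucket name is hypothetical):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-backup-bucket", LifecycleConfiguration=config)
```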

Step 4: Validate Rule Execution

AWS applies lifecycle policies automatically in the background, typically evaluating rules once per day. You can verify transitions by checking an object's storage class in the console, through S3 Storage Lens and CloudWatch storage metrics, or in S3 server access logs.

Best Practices for Static Website Hosting and Lifecycle Automation

  • Enable S3 Block Public Access for non-web buckets
  • Serve public content through CloudFront for secure, globally accelerated distribution
  • Apply IAM policies using the principle of least privilege
  • Enable server access logging
  • Compress or transform images using S3 Object Lambda
  • Enable Transfer Acceleration for large uploads

Cost Optimization Best Practices

  • Use lifecycle rules to move infrequently used objects
  • Leverage S3 Storage Lens for monitoring
  • Enable Intelligent-Tiering

Backup Best Practices

  • Enable versioning for rollback protection
  • Use replication across AWS regions
  • Configure automatic deletion of old versions

Conclusion

This hands-on guide covered the complete process of hosting a static website on Amazon S3 and configuring automated backup using lifecycle rules. Both of these tasks form the foundation of real-world cloud workloads and help learners build confidence in cloud storage, automation, and deployment strategies. By understanding these concepts, you can easily create scalable, high-performance, and cost-efficient cloud infrastructure suitable for enterprise and production environments.
