AWS Basics

Who’s the largest cloud provider out there again?

Unit Goals

To become familiar with what AWS provides and gain a working knowledge of a few of its basic services.

What We Will Cover

This will be a technical introduction. We will not go through the history of AWS, the company, how much money it makes, who its clients are, and so on. That information is publicly available.

Let’s see how to use it.

Getting Started

There are many resources on the web for getting started and you should use those. We’ll assume you are all signed up.

Before building something, get to know what kind of services AWS can provide. As of April 2019, AWS provides well over 100 services. Here are just a few of the popular ones:

Compute
  • EC2 (Elastic Compute Cloud): Create and use (virtual) machines
  • Lambda: Functions (serverless)
  • ECS (Elastic Container Service): Manage Docker containers
  • Fargate: Containers without worrying about servers
  • Batch: Batch jobs
  • Lightsail: Virtual private servers, good for getting started
  • Elastic Beanstalk: For running webapps
Storage
  • S3 (Simple Storage Service): Store objects in buckets
  • EBS (Elastic Block Store): Block storage for EC2 instances
  • EFS (Elastic File System): Fully managed file system for EC2
  • Glacier: Low-cost archival storage
  • Backup: Configure backup strategies
Database
  • RDS (Relational Database Service): The standard relational database service
  • Aurora: High-performance, managed relational database
  • DynamoDB: Managed NoSQL database
  • DocumentDB: Managed Mongo-compatible document database
  • ElastiCache: In-memory (key-value) caching system
  • Redshift: Data warehouse
  • Neptune: Managed graph database
  • Timestream: Managed time series database
Security, Identity, and Compliance
  • IAM (Identity and Access Management): Identity management and roles for the AWS account
  • Cognito: Identity management for apps
  • GuardDuty: Threat detection service
  • Inspector: Application security analyzer
  • Certificate Manager: SSL/TLS certificates
  • Firewall Manager: Manage firewall rules
  • Secrets Manager: Manage (including rotation of) secrets
  • Shield: DDoS protection
  • WAF: Filters malicious traffic
Networking and Content Delivery
  • VPC: Virtual Private Cloud
  • CloudFront: Global CDN
  • Route 53: DNS
  • API Gateway: For API deployment
  • ELB: Elastic Load Balancer
Application Integration
  • SQS: Simple Queue Service
  • SNS: Simple Notification Service
Machine Learning
  • SageMaker: Build, train, and deploy large ML models
  • Lex: Voice and chatbots
  • Polly: Text to speech
  • Transcribe: Speech recognition
  • Rekognition: Image and text recognition and analysis
  • Comprehend: Discover insights and relationships in text
  • Translate: Language translation
  • Textract: Extract text and data from documents
Analytics
  • Athena: Query S3
  • CloudSearch: Managed search service
  • Elasticsearch: Manage Elasticsearch clusters
  • Kinesis: Real-time data streams
  • EMR: Elastic MapReduce
  • QuickSight: Business analytics service
Management and Governance
  • CloudFormation: Automate infrastructure creation
  • CloudWatch: Monitoring and logging
  • CloudTrail: Track user activity and API usage

Other categories: Developer Tools, Business Applications, Game Tech, Internet of Things, Media Services, Robotics, Blockchain, Customer Management, Mobile, End User Computing, and Satellite.

One other good thing to know right up front: there are three ways to interact with the resources in your AWS account:

  • The AWS Management Console, in the browser
  • The command line interface (awscli)
  • The SDKs for your favorite programming languages (we’ll use boto3, the SDK for Python)

We’ll introduce these as needed. But now let’s get to work.

Some Warmup Tasks

We’re not going to cover services and best practices and programming examples one-by-one; instead, we’re just going to do things and learn what we need along the way. And we’re not going to get very sophisticated; a useful tour of AWS would take hours and hours and hours.

Some of these tasks may cost real money

You will need to watch your costs. If you spin up resources, terminate them as soon as you finish with them to keep costs down.

Also, for simplicity, these notes will not say anything about best practices to control costs.

You are on your own here. You can read up on the AWS “Free Tier” and other Amazon docs for running as cheaply as possible.

Are you in the LMU class?

Our institution is part of AWS Educate, so you can sign up under the institutional account and get $100 free. The usage is capped, so you still want to be careful, but you have more than enough credits to run through the tasks on this page.
These notes do not comprise an official tutorial

You won’t find any hand-holding here. Just a bunch of bullet points used to guide in-class demonstrations and classwork, so it will be really hard to follow on your own. And this lecture isn’t going on YouTube, so come to class if you’re interested in this stuff.

0. Setup

You shouldn’t really do anything unless you have a security mindset and understand a few basic things. So let’s get all set up.

  1. Your AWS account comes with a root user that has a user name and password. You should not use it, other than the very first time you log in. If this is your first time, or you haven’t made any users yet, log in with the root user.
  2. Go to the IAM service.
  3. Create an actual user. For the purposes of our in-class demos, AdministratorAccess is okay. Probably best to create a group first then add the user to it, but you can attach roles to a user directly.
  4. Make your ~/.aws/credentials and ~/.aws/config files. For simplicity, let’s use the default profile. Config file:
    [default]
    output = json
    region = us-east-1

    Credentials file (paste the keys you got when you created the user):

    [default]
    aws_access_key_id = xxxxxxxxxxxxxxxxxxxx
    aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  5. Log out of the console so you are not the root user anymore.
  6. Log in using the user you just created. (You will have had to make a note of the sign-in link.)
  7. Install the awscli.
  8. Test your awscli installation.
    $ aws help
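Both files are in INI format, so if something looks off you can sanity-check them with Python’s standard configparser module. A quick sketch (the file contents here are inlined placeholders, not read from your home directory):

```python
import configparser

# Placeholder contents mirroring the ~/.aws/config file created above
config_text = """
[default]
output = json
region = us-east-1
"""

parser = configparser.ConfigParser()
parser.read_string(config_text)

# The default profile should specify an output format and a region
print(parser["default"]["region"])
print(parser["default"]["output"])
```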

1. Store some data in S3 and retrieve it

S3 stands for Simple Storage Service. It organizes your data into buckets. Each bucket has a globally unique name. Buckets store objects, which can basically be anything. Object names, called keys, can have slashes in them, making it look like your bucket is organized into folders.
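Since keys are just strings, the “folder” structure is an illusion you can compute yourself. Here’s a sketch, with made-up keys, that groups keys by their top-level prefix the way the S3 console displays folders:

```python
from collections import defaultdict

# Hypothetical object keys in a bucket; the slashes are part of each key itself
keys = ["logos/ts-logo.png", "logos/go-logo.png", "docs/readme.txt", "index.html"]

folders = defaultdict(list)
for key in keys:
    prefix, _, rest = key.partition("/")
    # Keys without a slash go into the bucket's "root" (empty prefix)
    folders[prefix if rest else ""].append(rest or prefix)

print(dict(folders))
```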

  1. In the AWS console, go to the S3 service.
  2. Create a bucket. I called mine programming-languages-logos; you will need to pick another name because bucket names must be universally unique.
  3. Set its properties and access restrictions.
  4. Upload an image in the console. (I uploaded ts-logo.png.)
  5. Download the image in the console.
  6. On the command line, download your image with
    $ aws s3api get-object --key=ts-logo.png --bucket=programming-languages-logos ts
    This displays some JSON in the console describing the downloaded object, and writes the object itself to a file called ts.
  7. Upload a new image on the command line
    $ aws s3api put-object --key=go-logo.png --bucket=programming-languages-logos --body=go-logo.png
  8. In the AWS console, check that the new image is in the bucket.
  9. On the command line, list the contents of that bucket.
    $ aws s3api list-objects --bucket=programming-languages-logos
  10. You can even use the command line to list all the buckets in your account
    $ aws s3api list-buckets
  11. Write a Python program to list the contents of a bucket.
    import sys
    import boto3

    # Print the key of every object in the given bucket.
    s3 = boto3.client('s3', 'us-east-1')
    for key in s3.list_objects(Bucket=sys.argv[1])['Contents']:
        print(key['Key'])

    Assuming the script was saved as list_bucket.py (the name is up to you):

    $ python list_bucket.py programming-languages-logos
  12. Write a Python program to generate a time-bombed URL to an image.
    import sys
    import boto3
    from botocore.client import Config

    # Print a pre-signed URL, good for 60 seconds, for the given bucket and key.
    s3 = boto3.client('s3', 'us-east-1', config=Config(signature_version='s3v4'))
    params = {'Bucket': sys.argv[1], 'Key': sys.argv[2]}
    print(s3.generate_presigned_url('get_object', Params=params, ExpiresIn=60))

    Assuming the script was saved as presign.py:

    $ python presign.py programming-languages-logos ts-logo.png
  13. Try that link out in the browser, quickly. You should see the image.
  14. Wait 60 seconds and try the link again. The browser will show an Error document.

2. Spin up an EC2 Box and do some stuff

  1. In the AWS Console, go to the EC2 dashboard.
  2. Launch Instance (choose something “Free-tier eligible” — a t2.micro running Ubuntu should suffice).
  3. Click Review and Launch. On the Review and Launch Page, click on Edit Security Groups.
  4. Choose a name for your SG and add a rule for SSH, MyIP. You can use defaults for everything else (you will get a default VPC, etc.)
  5. Click Launch, which will prompt you for a key pair. It’s best practice for you to create your own key pair and use it, but you can, if you trust AWS, have it generate a key pair for you. If you do, make sure to chmod the private key file to 400 and put it in a safe place on your laptop.
  6. Launch
  7. View instances. Browse all the information on the console. How much do you recognize after a whole semester in the Networks class (and from your earlier Operating Systems class)?
  8. From your laptop command line, ssh into your box. (I don’t know how to do this on Windows; maybe you need PuTTY for all I know....) For me, with my instance’s public DNS name in place of the placeholder, it was:
    $ ssh -i ~/.ssh/demo.pem ubuntu@<your-instance-public-dns>
  9. You should be in, so create some folders and files, and run all your favorite command line Unix programs. You probably have Python installed. You can fetch and download Node and Ruby too. Or even an assembler!
  10. Have a friend try to ssh into your box. That should not work. If it does, that’s because you did not set up your security group correctly.
  11. Exit.
  12. If you like, from the EC2 console, terminate the instance. (If you don't, you will incur charges at some point.)
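The “My IP” rule you created is just a /32 CIDR block, which is why your friend’s connection never gets through. A sketch with Python’s standard ipaddress module (the addresses are made up, from the documentation ranges) shows the check the security group is effectively doing:

```python
import ipaddress

# Hypothetical security group rule: allow SSH only from one address (a /32)
allowed = ipaddress.ip_network("203.0.113.7/32")

my_ip = ipaddress.ip_address("203.0.113.7")
friend_ip = ipaddress.ip_address("198.51.100.23")

print(my_ip in allowed)      # True: the connection is allowed
print(friend_ip in allowed)  # False: the packet never reaches sshd
```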

Aside: AWS has a lot of instance types. An example is t3.large, where t is the type, 3 is the generation, and large is the size. Sizes are nano, micro, small, medium, large, xlarge, 2xlarge, and a lot of different multiples of xlarge. There are a few type families, which are fun to try to remember:

General Purpose
  • t2, 3, 3a (Tiny, Turbo): Burstable CPU for varying workloads. Good for webapps, microservices, test environments. Available from nano up to 2xlarge.
  • m4, 5, 5a, 5d, 5ad (Main): Good for general purpose enterprise apps, backend servers, mid-size databases. Best for steadier workloads. Only comes in large and above.
  • a1 (ARM): Features the custom-built AWS Graviton processor. From medium to 4xlarge. Good for general purpose computing, containers, and development environments.
Compute Optimized
  • c4, 5, 5a, 5d, 5n (Compute): For compute-intensive tasks like batch processing, analytics, science and engineering, gaming, and video encoding. The n suffix means better networking; those instances are good for massively scalable games and the like.
Accelerated Computing
  • p2, 3, 3dn (Parallel): General purpose massively parallel computing with GPUs with thousands of cores. Good for machine learning, computational fluid dynamics, finance, seismic analysis, molecular modeling, genomics, speech recognition, drug discovery. From xlarge to 24xlarge and beyond.
  • g2, 3, 3s (Graphics): Optimized for graphics-intensive applications: 3D visualizations, graphics remote workstations, 3D rendering, application streaming, video encoding.
  • f1 (FPGA): Hardware acceleration through field programmable gate arrays (FPGAs). Good for genomics, financial analytics, real-time video processing, big data, security.
Memory Optimized
  • r4, 5, 5a, 5d (RAM): Optimized for memory-intensive applications, such as in-memory databases, data mining and analysis, in-memory caches, real-time processing of unstructured big data, Hadoop/Spark clusters, and big data analytics.
  • x1, 1e (?): Really large in-memory databases; the x1e.32xlarge has almost 4 TB of memory.
  • z1d (?): AWS made this instance with ultra-fast 4.0 GHz cores so customers could reduce the costs of software, such as EDA tools and databases, that have per-core licensing.
Storage Optimized
  • i2, 3, 3en (I/O): Optimized for high IOPS with SSD storage. Good for NoSQL databases, scale-out transactional databases, data warehousing, Elasticsearch, analytics workloads. The en versions go up to 100 Gbps networking.
  • h1 (HDD): Up to 16 TB local HDD storage. Used for MapReduce, distributed file systems, network file systems, log or data processing applications, and big data workload clusters.
  • d2 (Dense): Up to 48 TB local HDD storage. Used in Massively Parallel Processing (MPP) data warehousing, distributed computing, distributed file systems, network file systems, log processing.
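The naming scheme is regular enough to parse. Here’s a sketch (the pattern covers the type-generation-suffix-size forms described above, not necessarily every instance type AWS has ever offered):

```python
import re

# family letter(s), generation digit, optional suffix letters (a, d, n, ...), size
INSTANCE_TYPE = re.compile(r"^([a-z]+)(\d)([a-z]*)\.(\w+)$")

def parse_instance_type(name):
    """Split an instance type name like t3.large into its components."""
    family, generation, suffix, size = INSTANCE_TYPE.match(name).groups()
    return {"family": family, "generation": int(generation), "suffix": suffix, "size": size}

print(parse_instance_type("t3.large"))
print(parse_instance_type("i3en.24xlarge"))
```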

3. Deploy a static website with S3 and CloudFront

Let’s make a trivial hello world type site. If you want to use create-react-app, that’s fine too.

  1. The simplest way is to make a bucket that is really public. AWS will complain that you made the bucket public, but for right now let’s see that it works.
  2. Create the bucket with all those “block” and “remove” settings unchecked. AWS will warn you, but for now, go through with it.
  3. Open the Static Website Hosting box in properties, check it, and say you want index.html as the main page.
  4. Add the bucket policy from these instructions.
  5. Upload your little website.
  6. The URL that it lives at is in the Static Website Hosting box.
Exercise: Do this with CloudFront instead; buckets should not be exposed the way we just exposed this one.
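The bucket policy in step 4 is a small JSON document granting public read access. A sketch of building one programmatically (substitute your own bucket name; mine is used as the placeholder):

```python
import json

bucket = "programming-languages-logos"  # use your own bucket name here

# Standard public-read policy: anyone may GetObject on anything in the bucket
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{bucket}/*",
    }],
}

print(json.dumps(policy, indent=2))
```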

4. Write a serverless function with Lambda
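A Lambda function in Python is just a module with a handler. A minimal sketch (the event shape is whatever your trigger sends; here it’s a made-up dict) that you can exercise locally before uploading:

```python
import json

def handler(event, context):
    """Entry point that Lambda invokes; the return value becomes the response."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"message": f"Hello, {name}!"})}

# Lambda calls handler(event, context) for us; locally we can fake the call
print(handler({"name": "AWS"}, None))
```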

5. Check your logs on CloudWatch
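Anything a Lambda function prints ends up in its CloudWatch log group. Printing one JSON object per line makes the entries easy to filter later with CloudWatch Logs Insights. A sketch, with made-up fields:

```python
import json
import time

def log(level, message, **fields):
    """Emit one JSON object per line; structured fields are queryable later."""
    entry = {"timestamp": time.time(), "level": level, "message": message, **fields}
    print(json.dumps(entry))
    return entry

log("INFO", "object uploaded", bucket="programming-languages-logos", key="go-logo.png")
```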

6. Create and use an RDS database

7. Use AWS Translate

You can use the console:


Or the command line:

$ aws translate translate-text --source-language-code=en --target-language-code=es --text='Hello, world!'
{
    "TranslatedText": "¡Hola, mundo!",
    "SourceLanguageCode": "en",
    "TargetLanguageCode": "es"
}

Or you can do it from Python also:

import sys
import boto3

client = boto3.client('translate')

# Translate the text given on the command line from the source to the target language.
response = client.translate_text(
    Text=sys.argv[3],
    SourceLanguageCode=sys.argv[1],
    TargetLanguageCode=sys.argv[2])
print(response.get('TranslatedText', response))

Assuming the script was saved as translate.py:

$ python translate.py en de 'Where is the train station?'
Wo ist der Bahnhof?
$ python translate.py en ja "Those who break the rules are scum, that's true, but those who abandon their friends are worse than scum."

Building an Infrastructure

TODO - sample architecture, CloudFormation, All the security concerns...


We’ve covered:

  • How one can get started with AWS
  • Deploying a trivial webapp
  • The categories of services
  • Using a few of the popular services
  • Infrastructure concerns