After finishing Bandit from overthewire.org, I wanted to try out another challenge, but this time focusing on the cloud. While looking for such a challenge, I stumbled upon flaws.cloud.
What is flaws.cloud?
From the site’s main page:
Through a series of levels you’ll learn about common mistakes and gotchas when using Amazon Web Services (AWS). There are no SQL injection, XSS, buffer overflows, or many of the other vulnerabilities you might have seen before. As much as possible, these are AWS specific issues.
A series of hints are provided that will teach you how to discover the info you’ll need. If you don’t want to actually run any commands, you can just keep following the hints which will give you the solution to the next level. At the start of each level you’ll learn how to avoid the problem the previous level exhibited.
Basically, this site focuses on the security aspects of AWS. Since so many sites are now hosted on AWS, learning about its common vulnerabilities, misconfigurations, and hardening becomes very important.
Huge thanks to Mr. Scott Piper of Summit Route for creating this site.
Challenges
I will be going through the challenge site and detailing the steps on this page. Though most of the time I will probably just be following the hints, I would still like to try to solve each level by myself.
Scope
Everything is run out of a single AWS account, and all challenges are sub-domains of flaws.cloud.
Level 1
This level is buckets of fun. See if you can find the first sub-domain.
Hint 1
The site flaws.cloud is hosted as an S3 bucket. This is a great way to host a static site, similar to hosting one via GitHub Pages. Some interesting facts about S3 hosting: when hosting a site as an S3 bucket, the bucket name (flaws.cloud) must match the domain name (flaws.cloud). Also, S3 buckets are a global namespace, meaning two people cannot have buckets with the same name. The result of this is that you could create a bucket named apple.com and Apple would never be able to host their main site via S3 hosting.
You can determine the site is hosted as an S3 bucket by running a DNS lookup on the domain, such as:
dig +nocmd flaws.cloud any +multiline +noall +answer
# Returns:
# flaws.cloud. 5 IN A 54.231.184.255
Visiting 54.231.184.255 in your browser will direct you to https://aws.amazon.com/s3/, so you know flaws.cloud is hosted as an S3 bucket. You can then run:
nslookup 54.231.184.255
# Returns:
# Non-authoritative answer:
# 255.184.231.54.in-addr.arpa name = s3-website-us-west-2.amazonaws.com
So we know it's hosted in the AWS region us-west-2. Side note (not useful for this game): all S3 buckets, when configured for web hosting, are given an AWS domain you can use to browse to them without setting up your own DNS. In this case, flaws.cloud can also be visited by going to http://flaws.cloud.s3-website-us-west-2.amazonaws.com/
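As a quick command-line sanity check (not required for the game), you can look at the response headers from that endpoint; S3 website endpoints typically identify themselves in the Server header:
curl -sI http://flaws.cloud.s3-website-us-west-2.amazonaws.com/ | grep -i '^server'
# Typically returns: Server: AmazonS3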
What will help you for this level is to know its permissions are a little loose.
Hint 2
You now know that we have a bucket named flaws.cloud in us-west-2, so you can attempt to browse the bucket by using the aws cli by running:
aws s3 ls s3://flaws.cloud/ --no-sign-request --region us-west-2
If you happen to not know the region, there are only a dozen or so regions to try. You could also use the GUI tool Cyberduck to browse this bucket, and it will figure out the region automatically.
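If you really wanted to brute-force the region, a small loop like the following would do (a rough sketch; in practice the CLI can often work out the right endpoint for you):
# Try the public listing against a handful of regions until one works
for region in us-east-1 us-east-2 us-west-1 us-west-2 eu-west-1 eu-central-1 ap-southeast-1; do
  echo "Trying $region"
  aws s3 ls s3://flaws.cloud/ --no-sign-request --region "$region" && break
done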
Finally, you can also just visit http://flaws.cloud.s3.amazonaws.com/ which lists the files due to the permissions issues on this bucket.
Opening the link stated in the above hint shows us an XML file (the browser notes that it has no style information associated with it and simply displays the document tree):
<ListBucketResult>
<Name>flaws.cloud</Name>
<MaxKeys>1000</MaxKeys>
<IsTruncated>false</IsTruncated>
<Contents>
<Key>hint1.html</Key>
<LastModified>2017-03-14T03:00:38.000Z</LastModified>
<ETag>"f32e6fbab70a118cf4e2dc03fd71c59d"</ETag>
<Size>2575</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
<Contents>
<Key>hint2.html</Key>
<LastModified>2017-03-03T04:05:17.000Z</LastModified>
<ETag>"565f14ec1dce259789eb919ead471ab9"</ETag>
<Size>1707</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
<Contents>
<Key>hint3.html</Key>
<LastModified>2017-03-03T04:05:11.000Z</LastModified>
<ETag>"ffe5dc34663f83aedaffa512bec04989"</ETag>
<Size>1101</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
<Contents>
<Key>index.html</Key>
<LastModified>2020-05-22T18:16:45.000Z</LastModified>
<ETag>"f01189cce6aed3d3e7f839da3af7000e"</ETag>
<Size>3162</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
<Contents>
<Key>logo.png</Key>
<LastModified>2018-07-10T16:47:16.000Z</LastModified>
<ETag>"0623bdd28190d0583ef58379f94c2217"</ETag>
<Size>15979</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
<Contents>
<Key>robots.txt</Key>
<LastModified>2017-02-27T01:59:28.000Z</LastModified>
<ETag>"9e6836f2de6d6e6691c78a1902bf9156"</ETag>
<Size>46</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
<Contents>
<Key>secret-dd02c7c.html</Key>
<LastModified>2017-02-27T01:59:30.000Z</LastModified>
<ETag>"c5e83d744b4736664ac8375d4464ed4c"</ETag>
<Size>1051</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
</ListBucketResult>
Note the value contained in the last <Key> tag. If you append it to the flaws.cloud URL, you will get the URL for the next level.
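For example, the secret file can be pulled straight from the command line with curl, using the key found in the listing above:
# Fetch the secret file revealed by the bucket listing
curl http://flaws.cloud/secret-dd02c7c.html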
Congrats! You found the secret file!
Level 2 is at http://level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud
Hint 3
At this point, you should have found a file that will tell you to go to the sub-domain http://level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud
Lesson learned
On AWS you can set up S3 buckets with all sorts of permissions and functionality including using them to host static files. A number of people accidentally open them up with permissions that are too loose. Just like how you shouldn’t allow directory listings of web servers, you shouldn’t allow bucket listings.
Examples of this problem
- Directory listing of the S3 buckets of Legal Robot and Shopify.
- Read and write permissions to an S3 bucket for Shopify (again) and Udemy.
This challenge did not have read and write permissions, as that would destroy the challenge for other players, but it is a common problem.
Avoiding the mistake
By default, S3 buckets are private and secure when they are created. To allow a bucket to be accessed as a web page, I had to turn on “Static Website Hosting” and change the bucket policy to allow everyone the “s3:GetObject” privilege, which is fine if you plan to publicly host the bucket as a web page. But then, to introduce the flaw, I changed the permissions to add “Everyone” with “List” permissions.
“Everyone” means everyone on the Internet. You can also list the files simply by going to flaws.cloud.s3.amazonaws.com due to that List permission.
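If you do want a publicly readable static site without exposing the listing, the bucket policy only needs s3:GetObject and nothing else. A minimal sketch (the bucket name is a placeholder):
# Public read on objects only; no s3:ListBucket means no directory listing
# (newer accounts may also need S3 Block Public Access relaxed for this to take effect)
aws s3api put-bucket-policy --bucket my-static-site-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "PublicReadGetObject",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::my-static-site-bucket/*"
  }]
}'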
Level 2
The next level is fairly similar, with a slight twist. You’re going to need your own AWS account for this. You just need the free tier.
Hint 1
You need your own AWS key, and you need to use the AWS CLI. Similar to the first level, you can discover that this sub-domain is hosted as an S3 bucket with the name level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud.
Its permissions are too loose, but you need your own AWS account to see what’s inside. Using your own account you can run:
aws s3 --profile YOUR_ACCOUNT ls s3://level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud
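If you want to confirm the hosting yourself first, the same DNS trick from level 1 works here too:
dig +nocmd level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud any +multiline +noall +answer
# Resolves to an S3 website endpoint in us-west-2, just like flaws.cloud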
How to create your AWS Free Tier account
- Creating an IAM user on AWS: within the AWS Dashboard, search for IAM. You may need to provide a phone number and a credit card number. (A CLI alternative is sketched after this list.)
- Add a user under “Users”.
- Create a username and select programmatic access (access key).
- Attach the user to a group, e.g. AdminS3, which can be created with “Create Group”.
- Add additional tags if need be for organization.
- Review and create user.
- Important — the Secret Access Key will ONLY be displayed at this point and if lost will need to be regenerated.
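If you prefer the CLI over the console, roughly the same steps might look like this (just a sketch; the group and user names are made up, and it assumes you already have admin credentials configured):
# Create a group with S3 read-only access, then a user with an access key
aws iam create-group --group-name AdminS3
aws iam attach-group-policy --group-name AdminS3 --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess
aws iam create-user --user-name flaws-player
aws iam add-user-to-group --group-name AdminS3 --user-name flaws-player
aws iam create-access-key --user-name flaws-player
# The secret access key appears only in this output, so save it right away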
Configure the AWS CLI on Mac or Linux
You can install awscli from Homebrew or from your distro's package manager. You can also install it using AWS's official installer bundle.
# Homebrew for macOS
brew install awscli

# For Debian-based distros
sudo apt install awscli

# Installing from the official AWS installer bundle
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
Type the command below in your terminal.
aws configure
Enter AWS Access Key ID:
Enter AWS Secret Access Key:
Enter Region: us-east-1
Enter Default Output: json
The default locations of the text files holding these parameters are:
~/.aws/config
~/.aws/credentials
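For reference, after aws configure runs, those two files end up looking something like this (the key values below are placeholders):
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# ~/.aws/config
[default]
region = us-east-1
output = json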
You can now access the S3 bucket:
aws s3 --profile default ls s3://level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud
2017-02-27 10:02:15 80751 everyone.png
2017-03-03 11:47:17 1433 hint1.html
2017-02-27 10:04:39 1035 hint2.html
2017-02-27 10:02:14 2786 index.html
2017-02-27 10:02:14 26 robots.txt
2017-02-27 10:02:15 1051 secret-e4443fc.html
Append secret-e4443fc.html to http://level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud/ and it will show the URL for the next level.
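You could also pull the secret file with the CLI instead of the browser by streaming the object to stdout:
aws s3 cp s3://level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud/secret-e4443fc.html - --profile default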
Hint 2
The next level is at http://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud
Lesson learned
Similar to opening permissions to Everyone, people accidentally open permissions to Any Authenticated AWS User. They might mistakenly think this will only be users of their account, when in fact it means anyone that has an AWS account.
Examples of this problem
Open permissions for authenticated AWS user on Shopify.
Avoiding the mistake
Only open permissions to specific AWS users. This screenshot is from the webconsole in 2017. This setting can no longer be set in the webconsole, but the SDK and third-party tools sometimes allow it.
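If another AWS account genuinely needs access, grant it to that specific principal instead of to any authenticated user. A rough sketch of such a bucket policy (the account ID, user, and bucket names are placeholders):
# Allow listing only for one named IAM user in a specific (placeholder) account
aws s3api put-bucket-policy --bucket my-shared-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::123456789012:user/trusted-partner"},
    "Action": "s3:ListBucket",
    "Resource": "arn:aws:s3:::my-shared-bucket"
  }]
}'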