NahamCon CTF 2024 Walkthrough: All About Robots
I recently had some spare time and decided to knock out a couple of the challenges from NahamCon CTF (heavily advertised by John Hammond on his LinkedIn) and post the walkthroughs here.
Once the challenge is started, we're pointed to a URL for a website titled "All About Robots".
From the title, it's clearly a challenge about robots.txt. For those who haven't caught on, clicking on each robot describes a bit more about that specific robot, and the "Learn More" button leads to https://www.robotstxt.org/, a resource on using robots.txt for websites.
For those who don't know, the robots.txt file is a standard used by websites to instruct web crawlers which pages or sections should not be crawled or indexed. It helps manage web traffic, protect sensitive information, and control search engine indexing by specifying disallowed paths. Web crawlers check robots.txt for instructions before accessing a site.
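As a quick illustration, a minimal robots.txt lives at the site root and might look like this (the paths here are made up for the example):

```
User-agent: *
Disallow: /admin/
Disallow: /private/
```

Each `Disallow` line tells crawlers matching the `User-agent` pattern not to fetch that path, but nothing technically enforces it.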
However, robots.txt can also be a security weakness. By listing directories and files to be excluded from indexing, it inadvertently highlights potentially sensitive areas of a website to malicious actors. Adversaries can review robots.txt to find and target restricted sections, making it a valuable reconnaissance tool in cyberattacks.
Back at the "All About Robots" webpage, we can simply append /robots.txt to the URL. Navigating to the file in our web browser shows us the user-agent information and a Disallow entry for /open_the_pod_bay_doors_hal_and_give_me_the_flag.html.
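As a sanity check on how a well-behaved crawler would read this file, Python's standard `urllib.robotparser` can parse the same rules locally. The file contents below are a sketch reconstructed from what the challenge page shows, assuming a wildcard user agent:

```python
from urllib.robotparser import RobotFileParser

# robots.txt contents as seen on the challenge site
# (assumed to use a wildcard User-agent)
robots_txt = """\
User-agent: *
Disallow: /open_the_pod_bay_doors_hal_and_give_me_the_flag.html
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler would skip the disallowed page...
print(rp.can_fetch("*", "/open_the_pod_bay_doors_hal_and_give_me_the_flag.html"))  # False
# ...but nothing stops a human from simply typing the URL in.
print(rp.can_fetch("*", "/"))  # True
```

This is exactly why robots.txt is a recon goldmine: the "do not crawl" list is honored only by polite software, not by people.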
Okay, now we know where the flag actually is. Let's navigate there.
With a congratulatory confetti blast, we are given the flag and can move on to other challenges in the CTF.