Question 544 - 312-50v12 discussion

In the process of footprinting a target website, an ethical hacker utilized various tools to gather critical information. The hacker encountered a target site where standard web spiders were ineffective due to a specific file in its root directory. However, they managed to uncover all the files and web pages on the target site by browsing the website manually while monitoring the resulting incoming and outgoing traffic. What technique did the hacker likely employ to achieve this?

A. Using Photon to retrieve archived URLs of the target website from archive.org
B. Using the Netcraft tool to gather website information
C. Examining HTML source code and cookies
D. User-directed spidering with tools like Burp Suite and WebScarab
Suggested answer: D

Explanation:

User-directed spidering is a technique in which the tester browses the target website manually while a proxy or spider tool captures and analyzes the resulting traffic. Because a human, not an automated crawler, drives the requests, the technique discovers hidden or dynamic content that standard web spiders miss when a file in the root directory, typically robots.txt, instructs them not to crawl certain pages or directories. It also lets the tester map content behind authentication (a human can log in where an obedient spider cannot) and spot vulnerabilities or sensitive information along the way. Tools such as Burp Suite and WebScarab support this workflow: both are web application security testing proxies that can intercept, modify, and replay HTTP requests and responses while building a site map from the observed traffic.
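Burp Suite and WebScarab are interactive GUI proxies, so as an illustration only, here is a minimal sketch of the same idea using mitmproxy as a scriptable stand-in (mitmproxy is this sketch's assumption, not a tool named in the question). The script logs every URL the analyst's browser requests while they walk the site by hand, producing the kind of site map a robots.txt-respecting crawler would never build.

```python
# log_spider.py -- a minimal user-directed spidering sketch, assuming
# mitmproxy is installed (pip install mitmproxy). Run with:
#   mitmdump -s log_spider.py
# then point the browser's proxy settings at 127.0.0.1:8080 and browse
# the target site manually.

from mitmproxy import http

SITE_MAP = "site_map.txt"  # hypothetical output file for discovered URLs


def request(flow: http.HTTPFlow) -> None:
    # Every URL the human-driven browser requests is appended to the site
    # map, including pages that robots.txt tells automated spiders to skip.
    with open(SITE_MAP, "a", encoding="utf-8") as f:
        f.write(flow.request.pretty_url + "\n")
```

The design point is that the proxy is purely passive: discovery coverage comes from the human's clicks, not from an automated crawl that the target's robots.txt (or anti-bot logic) could steer away.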

The other options are unlikely to achieve the same result. Photon can retrieve archived URLs of the target website from archive.org, but that historical snapshot may not reflect the site's current state or content. Netcraft gathers general information about a website, such as its IP address, domain name, server software, and hosting provider, but it does not enumerate the specific files or web pages the site hosts. Examining HTML source code and cookies can reveal clues about the site's structure, functionality, and user preferences, but it does not expose the hidden or dynamic content that user-directed spidering uncovers.
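For contrast with option A, the archive.org approach can be sketched directly against the public Wayback Machine CDX API, which serves the same archived-URL data Photon draws on. This is an illustrative sketch, not Photon itself; the target domain is a placeholder.

```python
# wayback_urls.py -- sketch of option A: listing a domain's archived URLs
# from archive.org via the Wayback Machine CDX API. Note the results are
# historical captures and may not match the live site's current content.

import json
import urllib.parse
import urllib.request


def archived_urls(domain: str, limit: int = 20) -> list[str]:
    params = urllib.parse.urlencode({
        "url": f"{domain}/*",   # all captured paths under the domain
        "output": "json",
        "fl": "original",       # return only the original URL field
        "collapse": "urlkey",   # de-duplicate repeated captures
        "limit": str(limit),
    })
    with urllib.request.urlopen(
        f"https://web.archive.org/cdx/search/cdx?{params}"
    ) as resp:
        rows = json.load(resp)
    return [row[0] for row in rows[1:]]  # skip the header row


if __name__ == "__main__":
    for url in archived_urls("example.com"):  # placeholder target
        print(url)
```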

Reference:

User Directed Spidering with Burp

Web Spidering - What Are Web Crawlers & How to Control Them

Web Security: Recon

Mapping the Application for Penetrating Web Applications --- 1
