bug: Detection of alive hosts #1172

Open
1 task done
psyray opened this issue Jan 16, 2024 · 2 comments
Assignees
Labels
bug Something isn't working top-priority

Comments

psyray commented Jan 16, 2024

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

@yogeshojha @AnonymousWP

According to this issue #1056 and some investigation on my side, I think we have a problem with the detection of alive hosts.

Let me explain: the following piece of code is used to check whether an endpoint is alive.

# If is_alive is True, select only endpoints that are alive
if is_alive:
    endpoints = [e for e in endpoints if e.is_alive]
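To see the failure mode end to end, here is a minimal standalone sketch (the SimpleNamespace objects are hypothetical stand-ins for reNgine's Endpoint model, not the real class):

```python
from types import SimpleNamespace

# Hypothetical stand-ins for Endpoint objects: the subdomain's only base
# endpoint answers 404, so its is_alive attribute is False.
endpoints = [
    SimpleNamespace(http_url="https://sub.example.com/", is_alive=False),
]

is_alive = True  # the flag passed by dir_file_fuzz / vulnerability_scan

# If is_alive is True, select only endpoints that are alive
if is_alive:
    endpoints = [e for e in endpoints if e.is_alive]

print(endpoints)  # [] -> nothing left to scan, so the task exits early
```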

The main problem with this check is that it is used as the base check to launch scans such as:

  • dir_file_fuzz

    rengine/web/reNgine/tasks.py

    Lines 1620 to 1628 in 52b4baa

    # Grab URLs to fuzz
    urls = get_http_urls(
        is_alive=True,
        ignore_files=False,
        write_filepath=input_path,
        get_only_default_urls=True,
        ctx=ctx
    )
    logger.warning(urls)
  • fetch_url

    rengine/web/reNgine/tasks.py

    Lines 1748 to 1754 in 52b4baa

    urls = get_http_urls(
        is_alive=enable_http_crawl,
        write_filepath=input_path,
        exclude_subdomains=exclude_subdomains,
        get_only_default_urls=True,
        ctx=ctx
    )
  • vulnerability_scan

So the get_http_urls method is a prerequisite for launching the scan types above.

The root cause is the is_alive method of the Endpoint class in the startScan model:

def is_alive(self):
    return self.http_status and (0 < self.http_status < 500) and self.http_status != 404
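Extracting the same condition as a plain function makes the cutoff visible; this is just the expression above evaluated over a few representative status codes:

```python
def is_alive(http_status):
    # Same condition as Endpoint.is_alive in the startScan model
    return bool(http_status) and (0 < http_status < 500) and http_status != 404

for status in (200, 301, 403, 404, 500, 503, 0):
    print(f"{status}: {is_alive(status)}")
# 200, 301 and 403 count as alive; 404, 500, 503 and 0 (no response) do not
```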

As you can see, when all of these conditions hold:

  • a URL returns 404 or an HTTP status code of 500 or above
  • the get_only_default_urls option is set to true
  • no default URL has been set (default URLs are only set during a target scan, not a subdomain scan)
    endpoint, _ = save_endpoint(
        http_url,
        ctx=ctx,
        crawl=enable_http_crawl,
        is_default=True,
        subdomain=subdomain
    )

then no base URL is returned, so no scan is launched.

This is problematic because dir_file_fuzz should still be launched even if the base endpoint returns 404, and the same goes for fetch_url and vulnerability_scan.

So we need to rework this part to always pass the base URL to these tools, and also to correctly set the default endpoint.

Expected Behavior

As soon as we have a subdomain that has an IP and gives some HTTP response, we must run:

  • dir_file_fuzz
  • vulnerability_scan
  • fetch_url
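One possible direction, purely as a sketch (has_http_response is a hypothetical name, not an existing reNgine method): treat any HTTP response at all as sufficient to feed the base URL to these tasks, since even a 404 or 5xx proves the web server is reachable:

```python
def has_http_response(http_status):
    """Hypothetical relaxed check: any HTTP status at all means the
    server answered, so the base URL is worth fuzzing/scanning."""
    return http_status is not None and http_status > 0

# A 404 or 503 base endpoint would now still be handed to
# dir_file_fuzz, vulnerability_scan and fetch_url.
for status in (200, 404, 503):
    print(f"{status}: {has_http_response(status)}")
```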

Steps To Reproduce

Try to launch a scan on a website whose base URL responds with HTTP status code 404 or >= 500.

Environment

- reNgine: 2.0.2
- OS: debian
- Python: 3.10

Workaround

As a workaround to launch the scan:

  • Setting enable_http_crawl to false in the launched scan engine can help launch scans in those HTTP-status cases:

    'enable_http_crawl': false,

  • Also, using the admin endpoints page and setting is_default to Yes on the base endpoint can solve the issue.

Modifying both of these values may be needed.

@psyray psyray added the bug Something isn't working label Jan 16, 2024

👋 Hi @psyray,
Issues are only for reporting a bug or feature request. Please read the documentation before raising an issue: https://rengine.wiki
For very limited support, questions, and discussions, please join the reNgine Discord channel: https://discord.gg/azv6fzhNCE
Please include all the requested and relevant information when opening a bug report. Improper reports will be closed without any response.

@psyray psyray self-assigned this Jan 16, 2024
Talanor commented May 12, 2024

@psyray Do you have any update or ETA on this?
