bug: <0 vuln found and keep in progress forever> #1446

Open
1 task done
lintianyuan666 opened this issue Sep 23, 2024 · 9 comments
Labels
bug Something isn't working

Comments

@lintianyuan666

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

I scanned many targets but no vulnerabilities were found, while ZAP finds many. The scan also stays in progress for 2 days, and it is only 1 target.

Expected Behavior

I would expect at least one vulnerability to be reported.

Steps To Reproduce

1. reNgine version is 2.2.0
2. Target is perceptyx.com

Environment

- reNgine: 2.2.0
- OS: Ubuntu 22
- Python: 3.10
- Docker Engine: 27.2.1
- Docker Compose: none.
- Browser: chrome 128.0.6613.138

Anything else?

No response

lintianyuan666 added the bug label on Sep 23, 2024
Contributor

Hey @lintianyuan666! 👋 Thanks for flagging this bug! 🐛🔍

You're our superhero bug hunter! 🦸‍♂️🦸‍♀️ Before we suit up to squash this bug, could you please:

📚 Double-check our documentation: https://rengine.wiki
🕵️ Make sure it's not a known issue
📝 Provide all the juicy details about this sneaky bug

Once again - thanks for your vigilance! 🛠️🚀

@yogeshojha
Owner

@lintianyuan666 does the recon find any http URLs at least?

@lintianyuan666
Author

@lintianyuan666 does the recon find any http URLs at least?

Nothing found.
Screenshot 2024-09-23 220222
I restarted the scan an hour ago, but I have been scanning this domain for 3 days and it still shows the result in this picture.
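
(As a sanity check, one can verify from the host running reNgine that the target resolves and answers HTTP at all; this is a minimal sketch using standard tools, with the target taken from the report above:)

# Confirm DNS resolution and basic HTTP reachability of the target from the scan host.
dig +short perceptyx.com
curl -sS -o /dev/null -w '%{http_code}\n' -I https://perceptyx.com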

@yogeshojha
Owner

Strange, I checked it just with subfinder and the target looks fine. I am now checking in reNgine. Do you have a proxy or VPN set up? I would also like to see what your YAML config looks like.
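
(For reference, a manual check along those lines might look like this; a sketch, assuming subfinder and httpx from ProjectDiscovery are installed locally, and the exact flags reNgine passes may differ:)

# Passively enumerate subdomains, then probe which of them answer HTTP.
subfinder -d perceptyx.com -silent > subdomains.txt
httpx -l subdomains.txt -silent -status-code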

@lintianyuan666
Author

Strange, I checked it just with subfinder and the target looks fine. I am now checking in reNgine. Do you have a proxy or VPN set up? I would also like to see what your YAML config looks like.

Thanks for your attention. I didn't use a proxy or VPN. Here is my YAML config:
root@adfcc:/soft/rengine# cat default_yaml_config.yaml

# Global vars for all tools
custom_headers: ['Foo: bar', 'User-Agent: Anything'] # FFUF, Nuclei, Dalfox, CRL Fuzz, HTTP Crawl, Fetch URL, etc
enable_http_crawl: true # All tools
timeout: 10 # Subdomain discovery, Screenshot, Port scan, FFUF, Nuclei
threads: 30 # All tools
rate_limit: 150 # Port scan, FFUF, Nuclei
intensity: 'normal' # Screenshot (grab only the root endpoints of each subdomain), Nuclei (reduce number of endpoints to scan), OSINT (not implemented yet)
retries: 1 # Nuclei

subdomain_discovery: {
  'uses_tools': ['subfinder', 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'], # amass-passive, amass-active, All
  'enable_http_crawl': true,
  'threads': 30,
  'timeout': 5,
  'use_subfinder_config': false,
  'use_amass_config': false,
  'amass_wordlist': 'deepmagic.com-prefixes-top50000'
}
http_crawl: {
  'threads': 30,
  'follow_redirect': true
}
port_scan: {
  'enable_http_crawl': true,
  'timeout': 5,
  'exclude_ports': [],
  'exclude_subdomains': [],
  'ports': ['top-100'],
  'rate_limit': 150,
  'threads': 30,
  'passive': false,
  'use_naabu_config': false,
  'enable_nmap': true,
  'nmap_cmd': '',
  'nmap_script': '',
  'nmap_script_args': ''
}
osint: {
  'discover': [
    'emails',
    'metainfo',
    'employees'
  ],
  'dorks': [
    'login_pages',
    'admin_panels',
    'dashboard_pages',
    'stackoverflow',
    'social_media',
    'project_management',
    'code_sharing',
    'config_files',
    'jenkins',
    'wordpress_files',
    'php_error',
    'exposed_documents',
    'db_files',
    'git_exposed'
  ],
  'custom_dorks': [],
  'intensity': 'normal',
  'documents_limit': 50
}
dir_file_fuzz: {
  'auto_calibration': true,
  'enable_http_crawl': true,
  'rate_limit': 150,
  'extensions': ['html', 'php', 'git', 'yaml', 'conf', 'cnf', 'config', 'gz', 'env', 'log', 'db', 'mysql', 'bak', 'asp', 'aspx', 'txt', 'conf', 'sql', 'json', 'yml', 'pdf'],
  'follow_redirect': false,
  'max_time': 0,
  'match_http_status': [200, 204],
  'recursive_level': 2,
  'stop_on_error': false,
  'timeout': 5,
  'threads': 30,
  'wordlist_name': 'dicc'
}
fetch_url: {
  'uses_tools': ['gospider', 'hakrawler', 'waybackurls', 'katana', 'gau'],
  'remove_duplicate_endpoints': true,
  'duplicate_fields': ['content_length', 'page_title'],
  'enable_http_crawl': true,
  'gf_patterns': ['debug_logic', 'idor', 'interestingEXT', 'interestingparams', 'interestingsubs', 'lfi', 'rce', 'redirect', 'sqli', 'ssrf', 'ssti', 'xss'],
  'ignore_file_extensions': ['png', 'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'],
  'threads': 30,
  'exclude_subdomains': false
}
vulnerability_scan: {
  'run_nuclei': true,
  'run_dalfox': false,
  'run_crlfuzz': false,
  'run_s3scanner': false,
  'enable_http_crawl': true,
  'concurrency': 50,
  'intensity': 'normal',
  'rate_limit': 150,
  'retries': 1,
  'timeout': 5,
  'fetch_gpt_report': true,
  'nuclei': {
    'use_nuclei_config': false,
    'severities': ['unknown', 'info', 'low', 'medium', 'high', 'critical'],
    # 'tags': [], # Nuclei tags (https://github.com/projectdiscovery/nuclei-templates)
    # 'templates': [], # Nuclei templates (https://github.com/projectdiscovery/nuclei-templates)
    # 'custom_templates': [] # Nuclei custom templates uploaded in reNgine
  }
}
waf_detection: {
  'enable_http_crawl': true
}
screenshot: {
  'enable_http_crawl': true,
  'intensity': 'normal',
  'timeout': 10,
  'threads': 40
}

This is my YAML config when I select the full scan engine while initializing the scan on the web page:

subdomain_discovery: {
  'uses_tools': ['subfinder', 'chaos', 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'],
  'enable_http_crawl': true,
  'threads': 30,
  'timeout': 5,
}
http_crawl: {}
port_scan: {
  'enable_http_crawl': true,
  'timeout': 5,
  'exclude_ports': [],
  'exclude_subdomains': [],
  'ports': ['top-100'],
  'rate_limit': 150,
  'threads': 30,
  'passive': false,
  'use_naabu_config': false,
  'enable_nmap': true,
  'nmap_cmd': '',
  'nmap_script': '',
  'nmap_script_args': ''
}
osint: {
  'discover': [
    'emails',
    'metainfo',
    'employees'
  ],
  'dorks': [
    'login_pages',
    'admin_panels',
    'dashboard_pages',
    'stackoverflow',
    'social_media',
    'project_management',
    'code_sharing',
    'config_files',
    'jenkins',
    'wordpress_files',
    'php_error',
    'exposed_documents',
    'db_files',
    'git_exposed'
  ],
  'intensity': 'normal',
  'documents_limit': 50
}
dir_file_fuzz: {
  'auto_calibration': true,
  'enable_http_crawl': true,
  'rate_limit': 150,
  'extensions': ['html', 'php', 'git', 'yaml', 'conf', 'cnf', 'config', 'gz', 'env', 'log', 'db', 'mysql', 'bak', 'asp', 'aspx', 'txt', 'conf', 'sql', 'json', 'yml', 'pdf'],
  'follow_redirect': false,
  'max_time': 0,
  'match_http_status': [200, 204],
  'recursive_level': 2,
  'stop_on_error': false,
  'timeout': 5,
  'threads': 30,
  'wordlist_name': 'dicc'
}
fetch_url: {
  'uses_tools': ['gospider', 'hakrawler', 'waybackurls', 'katana', 'gau'],
  'remove_duplicate_endpoints': true,
  'duplicate_fields': ['content_length', 'page_title'],
  'enable_http_crawl': true,
  'gf_patterns': ['debug_logic', 'idor', 'interestingEXT', 'interestingparams', 'interestingsubs', 'lfi', 'rce', 'redirect', 'sqli', 'ssrf', 'ssti', 'xss'],
  'ignore_file_extensions': ['png', 'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'],
  'threads': 30
}
vulnerability_scan: {
  'run_nuclei': true,
  'run_dalfox': true,
  'run_crlfuzz': true,
  'enable_http_crawl': true,
  'concurrency': 50,
  'intensity': 'normal',
  'rate_limit': 150,
  'retries': 1,
  'timeout': 5,
  'fetch_gpt_report': false,
  'nuclei': {
    'use_nuclei_config': false,
    'severities': ['unknown', 'info', 'low', 'medium', 'high', 'critical']
  }
}
waf_detection: {
}
screenshot: {
  'enable_http_crawl': true,
  'intensity': 'normal',
  'timeout': 10,
  'threads': 40
}

custom_headers: ["Cookie: Test"]

@ncharron

Can you post your celery-entrypoint.sh file? Only lines roughly 160 to 205.
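
(If it helps, one way to grab just that range; a sketch, to be run from wherever celery-entrypoint.sh lives in your reNgine checkout:)

# Print only lines 160-205 of the celery entrypoint script.
sed -n '160,205p' celery-entrypoint.sh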

@lintianyuan666
Author

Maybe it's because my full scan is misconfigured; vulnerabilities are found when I scan with the recommended configuration.
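
(One quick way to rule out a YAML syntax problem in the pasted engine config; a sketch, assuming Python 3 with PyYAML is available, the file name is hypothetical, and reNgine itself may validate the config differently:)

# Check that the engine YAML saved to a local file parses cleanly.
python3 -c "import yaml; yaml.safe_load(open('full_scan_engine.yaml')); print('parsed OK')"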

@rezytijo

I have the same issue. It happens on all my servers; I use an ARM VPS on Oracle and a Kali VM on my home PC.
(screenshot)

I don't use any proxy or VPN. Any recommendations?

@rezytijo

I found something here:
(screenshot of the scan log)

There is a file called urls_unfurled.txt that I could not find when opening the Docker volume; I think this is the source of the problem.
(screenshot)
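
(To confirm whether the file is really missing, a check along these lines might help; a sketch in which the service name "celery" and the search root are assumptions about a typical docker-compose reNgine deployment, so adjust them to your setup:)

# List the running services, then search the celery container for the file the scan log references.
docker compose ps
docker compose exec celery find / -name 'urls_unfurled.txt' 2>/dev/null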
