-1:CON error on internal sites #26
Hi @BuscheIT, the information about nslookup failing on Windows is my fault. nslookup is not available on Windows/Cygwin, so this analysis should not even be attempted on Windows. A proper fix is on the roadmap.
Hello, of course we tried with HTTPS as well - our .test environment has it, and we only went for HTTP to avoid self-signed certificate trouble, as stated. We also disabled Windows Defender temporarily, since no request even seems to hit the .test server. The crawler is the first program giving us such trouble - we have never had problems with other network-dependent tools, so we would really love to figure this out. Linux screenshot coming soon - any ideas for debugging with Wireshark, gdb, or command-line options? PS: We use the standard options after startup but disable "allow images" to reduce requests a little - that shouldn't matter.
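One low-level way to check whether any traffic leaves the machine at all is a packet capture filtered on the internal server. This is only a sketch; 192.168.1.50 stands in for the real IP of the .test host.

# On the Linux machine: watch for traffic to the internal web server
sudo tcpdump -i any -n host 192.168.1.50 and tcp port 80

# Start the crawl in a second terminal. If no SYN packets appear in the
# capture, the connection attempt dies before reaching the network
# (e.g. name resolution inside the crawler), not at the server.

The same capture filter ("host 192.168.1.50 and tcp port 80") can be used in Wireshark on the Windows machine.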
Hi @ovenum, as part of the work on this issue I have made a number of improvements in the last few commits. If you know how to work with Git, you can run the current version from the "main" branch, or wait 2-3 weeks until I release version 1.0.9.
If you can, please test the current version from the main branch and let me know whether everything important works.
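For anyone following along, a rough sequence for trying the main branch might look like this. It is only a sketch: mysite.test is a placeholder, and it assumes the launcher script is ./crawler and that a swoole-cli binary matching your platform has been placed where that script expects it.

# Clone the repository and stay on the main branch
git clone https://github.com/janreges/siteone-crawler.git
cd siteone-crawler

# Download a swoole-cli build for your platform (the next comment used
# swoole-cli 6.0.0 for macos-arm64) and put it next to the launcher.

# Run a crawl against the internal site
./crawler --url=http://mysite.test/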
Thanks @janreges for looking into this. I just got the latest version from Git and added swoole-cli 6.0.0 macos-arm64. The crawl still fails with the -1:CON error (report attached), but running it with the resolve parameter solves the issue. Let me know if you require more information.
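For reference, a minimal sketch of what the resolve-based workaround could look like, assuming a curl-like host:port:address format for the parameter; mysite.test and 127.0.0.1 are placeholders and the exact syntax may differ between versions.

# Map the hostname straight to an IP so the crawler's internal DNS lookup
# (c-ares) is bypassed entirely
./crawler --url=http://mysite.test/ --resolve='mysite.test:80:127.0.0.1'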
Hi @ovenum, can you please describe how you have dnsmasq installed/configured on your system? Send, for example, the contents of your dnsmasq configuration, and please also send the output of the relevant DNS lookup commands. SiteOne Crawler uses Swoole because of its high performance. From what I've seen, Swoole internally uses the https://c-ares.org/ library, and it's quite likely that this library doesn't implement all the mechanisms that different operating systems offer for overriding standard DNS resolution.
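A quick way to see this difference on macOS (mysite.test is a placeholder): tools that go through the system resolver honor the /etc/resolver/test override, while tools that read /etc/resolv.conf directly, which is roughly what a c-ares based resolver does, do not.

# Uses the macOS system resolver, so /etc/resolver/test applies
# - expected to return 127.0.0.1
dscacheutil -q host -a name mysite.test

# Reads /etc/resolv.conf and ignores /etc/resolver
# - likely NXDOMAIN / no answer for a purely internal .test name
dig +short mysite.test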
Sure:

brew install dnsmasq
echo 'address=/.test/127.0.0.1' > /opt/homebrew/etc/dnsmasq.conf
sudo mkdir -v /etc/resolver
sudo bash -c 'echo "nameserver 127.0.0.1" > /etc/resolver/test'

Here's the output of the commands you've requested. Also attached is the additional output you asked for.
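A few commands that can help confirm the override is actually active before suspecting the crawler (mysite.test is a placeholder; this assumes dnsmasq runs as a Homebrew service).

# Restart dnsmasq so it picks up the config
sudo brew services restart dnsmasq

# Ask dnsmasq directly - should answer 127.0.0.1 for any .test name
dig +short @127.0.0.1 mysite.test

# Look for a resolver block with "domain : test" and nameserver 127.0.0.1
scutil --dns

# Resolve through the normal macOS lookup path
ping -c 1 mysite.test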
#10 describes the same error we are currently trying to resolve.
Our internal network is using a Windows DNS controller - all domain names resolve nicely on both of the test PCs we are using (Win11 and Linux Mint).
On both machines we are getting the -1:CON error and see no requests in the server logs.
On Win11 (with 1.0.8 portable), the first line of the report is: "Problem with DNS analysis: Crawler\Analysis\DnsAnalyzer::getDnsInfo: nslookup command failed."
Under Linux (Snap) we get the same empty report, just without the nslookup error.
All EXTERNAL sites can be scanned without problems - there is no internal proxy, all browsers work internally, and we are using plain HTTP to avoid possible certificate trouble.
Any ideas?
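One way to narrow this down (mysite.test stands in for a real internal name): confirm on the Linux machine that resolution and a plain HTTP request work outside the crawler, so the failure can be pinned on the crawler's own DNS/connect path.

# Resolve via the system's configured DNS (the Windows DNS controller)
dig +short mysite.test
getent hosts mysite.test

# Plain HTTP request without the crawler; if this succeeds while the
# crawler still reports -1:CON, the problem is inside the crawler
curl -v http://mysite.test/ -o /dev/null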