Joined: Jun 2006
Posts: 319
Enthusiast
Hi all,
I have noticed in Who's Online that Bytespider (bad_bot) has over 100 entries.
Another one is MJ12bot, which is trying to log in.
Should I do something about these two, and if so, what code do I need to run and where, please?
I have noticed that having a lot of these can slow the forum down.
Any assistance would be appreciated.
Joined: Dec 2003
Posts: 6,628 Likes: 85
If the bot honors it, you can create a robots.txt file in your root folder and exclude by user agent, IP, etc.:
https://en.wikipedia.org/wiki/Robots.txt
https://ubbdev.com/wiki/view/7/ubb-sitemaps.html
If they do not honor the robots file, you can ban them by IP via the .htaccess file or the UBB control panel. If it becomes a DDoS attack, you need to use something like Cloudflare or ask your host about DDoS protection.
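For example, a minimal robots.txt that excludes the two bots mentioned above might look like the following (a sketch; the file must sit at the web root, e.g. example.com/robots.txt, and only cooperating crawlers will obey it):

```
# Block ByteDance's and Majestic's crawlers site-wide
User-agent: Bytespider
Disallow: /

User-agent: MJ12bot
Disallow: /
```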
Blue Man Group There is no such thing as stupid questions. Just stupid answers
Joined: Jun 2006
Posts: 16,364 Likes: 126
FYI, if a bot visits the login page, the Who's Online page will say that it's trying to log in, since that's the wording for that system.
Joined: Jun 2006
Posts: 319
Enthusiast
Thanks for the reply. This is what I placed in the robots.txt file:

User-agent: Baiduspider
User-agent: 360Spider
User-agent: Yisouspider
User-agent: PetalBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: Sogou web spider
Disallow: /

User-agent: Sogou inst spider
Disallow: /

Does this look correct, and how long would it take before it starts to work?
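One way to sanity-check rules like these before waiting on the crawlers is to parse them locally with Python's standard-library robots.txt parser. This is a sketch using an abridged copy of the rules above; the paths and agent names are just examples:

```python
# Verify robots.txt rules locally with the stdlib parser.
import urllib.robotparser

# Abridged copy of the rules posted above.
rules = """\
User-agent: Bytespider
Disallow: /

User-agent: Sogou web spider
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A listed agent should be refused; an unlisted one should be allowed.
print(rp.can_fetch("Bytespider", "/forum/"))  # expect: False
print(rp.can_fetch("Googlebot", "/forum/"))   # expect: True
```

Note that robots.txt only restrains crawlers that choose to obey it; misbehaving bots ignore it entirely, which is where .htaccess blocking comes in.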
Last edited by Outdoorking; 04/13/2024 11:14 PM.
Joined: Jun 2006
Posts: 319
Enthusiast
Thanks Gizmo,
I would really like to know what code I need to put into the .htaccess file, because it appears that what I have placed in the robots.txt file is not working, unless I just have to wait longer.
Joined: Jun 2006
Posts: 16,364 Likes: 126
Well, both methods display their .htaccess input code. The first (blocking by user agent) would be:

# Block aggressive Chinese crawlers/scrapers/bots
# https://www.johnlarge.co.uk/blocking-aggressive-chinese-crawlers-scrapers-bots/
Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ahrefsbot|Baiduspider|BLEXBot|Bytespider|BuckyOHare|dotbot|exabot|gigabot|Goodzer|gsa-crawler|Kinza|LieBaoFast|LinkFeatureBot|MauiBot|Mb2345Browser|MicroMessenger|mj12bot|musobot|rogerbot|rushBot|semrushbot|serpstatbot|sitebot|Sogou|SputnikBot|VelenPublicWebCrawler|WBSearchBot|WPSpider|zh-CN|zh_CN [NC]
RewriteRule ^ - [F,L]

For the second method (IP2Location Firewall List by Country), you'd take the .htaccess code from their list for an Apache2 server and insert it as follows (the list is huge; I'm only including a couple of lines):

<Limit GET HEAD POST>
order allow,deny
allow from all
deny from 1.0.1.0/24
deny from 1.0.2.0/23
deny from 1.0.8.0/21
deny from 1.0.32.0/19
deny from 1.1.0.0/24
deny from 1.1.2.0/23
deny from 1.1.4.0/22
deny from 1.1.8.0/21
deny from 1.1.16.0/20
deny from 1.1.32.0/19
...
deny from 223.252.168.0/21
deny from 223.252.177.0/24
deny from 223.252.178.0/23
deny from 223.252.180.0/22
deny from 223.252.184.0/21
deny from 223.252.192.0/18
deny from 223.254.0.0/16
deny from 223.255.0.0/17
deny from 223.255.236.0/22
deny from 223.255.252.0/23
</Limit>
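For anyone wondering what the RewriteCond above actually does: it is a case-insensitive regex alternation matched against the incoming User-Agent header, and any hit gets a 403 Forbidden from the RewriteRule. A small sketch of the same matching logic in Python (the user-agent strings are made-up examples, and the pattern is abridged):

```python
# Sketch of the .htaccess user-agent block: a case-insensitive
# regex alternation against the User-Agent header.
import re

# Abridged version of the alternation in the RewriteCond above.
pattern = re.compile(
    r"ahrefsbot|Baiduspider|Bytespider|mj12bot|semrushbot|Sogou|zh-CN|zh_CN",
    re.IGNORECASE,
)

def is_blocked(user_agent: str) -> bool:
    """Return True if this user agent would hit the [F] (403) rule."""
    return bool(pattern.search(user_agent))

print(is_blocked("Mozilla/5.0 (compatible; Bytespider)"))              # expect: True
print(is_blocked("Mozilla/5.0 (Windows NT 10.0) Firefox/124.0"))       # expect: False
```

Keep in mind that user agents are trivially spoofed, which is why the country-based IP deny list is offered as the heavier-handed second option.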