Fail2ban not blocking GitLab attacks

Hello,

I installed GitLab on CentOS 8 and have it publicly accessible. I have installed fail2ban but can't seem to get it to work with GitLab. Only the SSH jail is blocking anything; the HTTP jails are not blocking anything even though I am seeing attacks. I have set the log paths to what I believe are the correct ones, but fail2ban still isn't capturing anything. Could someone help me figure out why fail2ban is not working with GitLab?

Below is my jail.local

[nginx-http-auth]
enabled  = true
filter   = nginx-http-auth
port     = http,https
logpath  = /var/log/gitlab/nginx/gitlab_error.log

[nginx-noscript]
enabled  = true
port     = http,https
filter   = nginx-noscript
logpath  = /var/log/gitlab/nginx/gitlab_access.log
maxretry = 6

[nginx-badbots]
enabled  = true
port     = http,https
filter   = nginx-badbots
logpath  = /var/log/gitlab/nginx/gitlab_access.log
maxretry = 2

[nginx-nohome]
enabled  = true
port     = http,https
filter   = nginx-nohome
logpath  = /var/log/gitlab/nginx/gitlab_access.log
maxretry = 2

[nginx-noproxy]
enabled  = true
port     = http,https
filter   = nginx-noproxy
logpath  = /var/log/gitlab/nginx/gitlab_access.log
maxretry = 2

[gitlab]
enabled = true
port = http,https
filter = gitlab
logpath = /var/log/gitlab/gitlab_error.log

Shouldn’t the last path be /var/log/gitlab/nginx/gitlab_error.log?

That nginx path is the only location where I can find the error log; it doesn't exist directly in /var/log/gitlab, so that may well be the reason the jail captures nothing. You can also debug fail2ban to see whether the filters are working properly. For example, for the gitlab filter in your fail2ban filter.d directory, we would need to see its regex, and then check whether it matches entries in gitlab_error.log — or perhaps it should be pointing at gitlab_access.log instead, depending on where the failed requests are being logged.
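If the path is the problem, the [gitlab] jail would presumably need to look something like the sketch below. This assumes the standard Omnibus layout where nginx logs live under /var/log/gitlab/nginx/ — verify the file actually exists on your box before relying on it:

```ini
# Hypothetical correction, assuming the Omnibus nginx log location.
# Check with `ls /var/log/gitlab/nginx/` first.
[gitlab]
enabled = true
port    = http,https
filter  = gitlab
logpath = /var/log/gitlab/nginx/gitlab_error.log
```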

Hello,
I have adjusted the jails to point to the error logs, and still nothing. This is one of the configs from filter.d:

# Fail2Ban configuration file
#
# Regexp to catch known spambots and software alike. Please verify
# that it is your intent to block IPs which were driven by
# above mentioned bots.


[Definition]

badbotscustom = EmailCollector|WebEMailExtrac|TrackBack/1\.02|sogou music spider|(?:Mozilla/\d+\.\d+ )?Jorgee
badbots = Atomic_Email_Hunter/4\.0|atSpider/1\.0|autoemailspider|bwh3_user_agent|China Local Browse 2\.6|ContactBot/0\.2|ContentSmartz|DataCha0s/2\.0|DBrowse 1\.4b|DBrowse 1\.4d|Demo Bot DOT 16b|Demo Bot Z 16b|DSurf15a 01|DSurf15a 71|DSurf15a 81|DSurf15a VA|EBrowse 1\.4b|Educate Search VxB|EmailSiphon|EmailSpider|EmailWolf 1\.00|ESurf15a 15|ExtractorPro|Franklin Locator 1\.8|FSurf15a 01|Full Web Bot 0416B|Full Web Bot 0516B|Full Web Bot 2816B|Guestbook Auto Submitter|Industry Program 1\.0\.x|ISC Systems iRc Search 2\.1|IUPUI Research Bot v 1\.9a|LARBIN-EXPERIMENTAL \(efp@gmx\.net\)|LetsCrawl\.com/1\.0 \+http\://letscrawl\.com/|Lincoln State Web Browser|LMQueueBot/0\.2|LWP\:\:Simple/5\.803|Mac Finder 1\.0\.xx|MFC Foundation Class Library 4\.0|Microsoft URL Control - 6\.00\.8xxx|Missauga Locate 1\.0\.0|Missigua Locator 1\.9|Missouri College Browse|Mizzu Labs 2\.2|Mo College 1\.9|MVAClient|Mozilla/2\.0 \(compatible; NEWT ActiveX; Win32\)|Mozilla/3\.0 \(compatible; Indy Library\)|Mozilla/3\.0 \(compatible; scan4mail \(advanced version\) http\://www\.peterspages\.net/?scan4mail\)|Mozilla/4\.0 \(compatible; Advanced Email Extractor v2\.xx\)|Mozilla/4\.0 \(compatible; Iplexx Spider/1\.0 http\://www\.iplexx\.at\)|Mozilla/4\.0 \(compatible; MSIE 5\.0; Windows NT; DigExt; DTS Agent|Mozilla/4\.0 efp@gmx\.net|Mozilla/5\.0 \(Version\: xxxx Type\:xx\)|NameOfAgent \(CMS Spider\)|NASA Search 1\.0|Nsauditor/1\.x|PBrowse 1\.4b|PEval 1\.4b|Poirot|Port Huron Labs|Production Bot 0116B|Production Bot 2016B|Production Bot DOT 3016B|Program Shareware 1\.0\.2|PSurf15a 11|PSurf15a 51|PSurf15a VA|psycheclone|RSurf15a 41|RSurf15a 51|RSurf15a 81|searchbot admin@google\.com|ShablastBot 1\.0|snap\.com beta crawler v0|Snapbot/1\.0|Snapbot/1\.0 \(Snap Shots, \+http\://www\.snap\.com\)|sogou develop spider|Sogou Orion spider/3\.0\(\+http\://www\.sogou\.com/docs/help/webmasters\.htm#07\)|sogou spider|Sogou web spider/3\.0\(\+http\://www\.sogou\.com/docs/help/webmasters\.htm#07\)|sohu agent|SSurf15a 11|TSurf15a 11|Under the Rainbow 2\.2|User-Agent\: Mozilla/4\.0 \(compatible; MSIE 6\.0; Windows NT 5\.1\)|VadixBot|WebVulnCrawl\.unknown/1\.0 libwww-perl/5\.803|Wells Search II|WEP Search 00

failregex = ^<HOST> -.*"(GET|POST|HEAD).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$

ignoreregex =

datepattern = ^[^\[]*\[({DATE})
              {^LN-BEG}

# DEV Notes:
# List of bad bots fetched from http://www.user-agents.org
# Generated on Thu Nov  7 14:23:35 PST 2013 by files/gen_badbots.
#
# Author: Yaroslav Halchenko

OK, so fail2ban will work ONLY if it finds entries in the log files that match the regex in the filter. If they don’t match then it won’t block.

Are you sure the log files contain entries that you know are attacks? If so, please post some of them so we can check.

You can do something like this:

fail2ban-regex /var/log/auth.log /etc/fail2ban/filter.d/sshd.conf

Of course, replace the log file above with the path to your GitLab log file, and replace the filter with gitlab.conf (or whatever name you gave the file). In your case that would presumably be something like:

fail2ban-regex /var/log/gitlab/nginx/gitlab_access.log /etc/fail2ban/filter.d/gitlab.conf

That filter filename (minus the .conf bit) should also match the filter = line in jail.local.

If that command finds no matches, it means one of two things: either the log contains no attack entries at all, or the filter's regex doesn't fit your log format. If you can see attack entries in the log, then you need to fix the regex in the filter so that it matches them. I can help you with this, but I need to see log entries that you know are attacks.

I have run fail2ban-regex against the gitlab_access.log file and all my filters come back empty, but when I do a tcpdump on ports 443 and 80, I do see attempts hitting the server, and they are being recorded in the gitlab_access.log file. How can I fix the filters to capture this correctly? And how can I share my log file with you?

Thanks,
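On sharing logs: you don't need the whole file — just paste a handful of lines you believe are attacks, with the IPs masked. A quick sed sketch (the sample line here is invented; in practice you would pipe your real gitlab_access.log through the same command):

```shell
# Replace every dotted-quad IP with a placeholder before posting log lines.
# The sample line is made up for illustration only.
sample='198.51.100.7 - - [08/Dec/2020:20:19:10 +0100] "GET / HTTP/1.1" 301 162'
echo "$sample" | sed -E 's/[0-9]{1,3}(\.[0-9]{1,3}){3}/x.x.x.x/g'
# prints: x.x.x.x - - [08/Dec/2020:20:19:10 +0100] "GET / HTTP/1.1" 301 162
```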

Well, I tried that filter at my end and it didn't work either. I even added some extra patterns to catch Baiduspider, which appeared in my log entry. Here it is:

x.x.x.x - - [08/Dec/2020:20:19:10 +0100] "GET / HTTP/1.1" 301 162 "" "Baiduspider+(+http://www.baidu.com/search/spider.htm);googlebot|baiduspider|baidu|spider|sogou|bingbot|bot|yahoo|soso|sosospider|360spider|youdaobot|jikeSpider;)" -

Look below at what I did to simplify the regex:

#failregex = ^<HOST> -.*"(GET|POST|HEAD).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$
failregex = ^<HOST> -.*"(GET|POST|HEAD).*HTTP.*" "Baiduspider.*"

That will obviously only match Baiduspider and nothing else, but it shows how you can edit the regex to make it work. As I said, your problem is one of two things: either the log entries don't match the regex, or the logs contain no entries the regex could ever match. Either way, you get zero results.
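If you want a quick sanity check without fail2ban, grep -E can approximate the match — its ERE syntax is close enough for this particular pattern. Note that fail2ban actually uses Python regexes, and the <HOST> tag has to be swapped for an IP pattern here; the log line is an anonymized stand-in for the one above:

```shell
# Stand-in for a gitlab_access.log line (IP invented for the example).
line='203.0.113.5 - - [08/Dec/2020:20:19:10 +0100] "GET / HTTP/1.1" 301 162 "" "Baiduspider+(+http://www.baidu.com/search/spider.htm)" -'

# The simplified failregex, with <HOST> swapped for a dotted-quad pattern.
pattern='^[0-9]+(\.[0-9]+){3} -.*"(GET|POST|HEAD).*HTTP.*" "Baiduspider.*"'

echo "$line" | grep -cE "$pattern"   # prints 1: the line matches
```

If this prints 0 against one of your real log lines, the regex is the problem, not the log path.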

Using the fail2ban-regex command, results:

Lines: 11648 lines, 0 ignored, 1 matched, 11647 missed

which is correct, as I only had one Baiduspider entry in my logs, so one match.

Now, taking that further, I’ve modified the regex from the gitlab.conf, to now include this:

failregex = ^<HOST> -.*"(GET|POST|HEAD).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s).*"

Now you can see it's using %(badbots)s and %(badbotscustom)s, which are defined at the beginning of the file. My badbotscustom looks like this:

badbotscustom = Baiduspider|EmailCollector|WebEMailExtrac|TrackBack/1\.02|sogou music spider|(?:Mozilla/\d+\.\d+ )?Jorgee

You can see I added Baiduspider to match my logs. And now, using that regex:

Lines: 11648 lines, 0 ignored, 1 matched, 11647 missed

I get 1 match, which is correct.