Sending F5 ASM (WAF) Logs to ELK Stack

Hello,

Last week I read the article https://devcentral.f5.com/s/articles/Implementing-BIG-IP-WAF-logging-and-visibility-with-ELK on the F5 DevCentral community and decided to implement it in my lab environment.

In this article I will install ELK (Elasticsearch, Logstash, Kibana) version 6 on Ubuntu Server 16.04, which has the IP address 172.16.160.128, then I will forward the logs of the F5 ASM (Application Security Manager), whose management IP address is 172.16.160.129, to the ELK stack. In this way we will visualize the HTTP attack requests on the ELK stack.

On the Ubuntu server I will install and configure the ELK stack, and on the F5 ASM I will create:

  • node
  • pool
  • pool member
  • virtual server
  • security policy (Rapid Deployment template) to block attack signatures immediately
  • logging profile to forward logs to the ELK stack

Here is my simple topology:

I used Damn Vulnerable Web Application (DVWA) as the vulnerable web application, and F5 BIG-IP version 15.1.0-0.0.31 as the web application firewall.

Installing ELK Stack on Ubuntu Server

Check the version of Ubuntu Server:

anaytemiz@ubuntu:~$ cat /etc/issue.net 
Ubuntu 16.04.3 LTS

Check the IP address of the Ubuntu server:

anaytemiz@ubuntu:~$ ifconfig 
ens33     Link encap:Ethernet  HWaddr 00:0c:29:68:75:12  
          inet addr:172.16.160.128  

Install the Java Environment

anaytemiz@ubuntu:~$ sudo apt-get install openjdk-8-jre-headless -y
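
You can quickly confirm the Java runtime is in place before moving on:

anaytemiz@ubuntu:~$ java -version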

Add ELK Stack 6 Repository

anaytemiz@ubuntu:~$ sudo wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
OK
anaytemiz@ubuntu:~$ echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list
deb https://artifacts.elastic.co/packages/6.x/apt stable main
anaytemiz@ubuntu:~$ 

Update Repository

anaytemiz@ubuntu:~$ sudo apt-get update

Install, Configure and Start Elasticsearch

anaytemiz@ubuntu:~$ sudo apt-get install elasticsearch

After Elasticsearch is successfully installed, open the elasticsearch.yml configuration file under the /etc/elasticsearch directory, uncomment the network.host line, and set it to 0.0.0.0.

anaytemiz@ubuntu:~$ sudo vim /etc/elasticsearch/elasticsearch.yml
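
After editing, the uncommented line should read:

network.host: 0.0.0.0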

Now we can start the Elasticsearch service:

anaytemiz@ubuntu:~$ sudo service elasticsearch start
anaytemiz@ubuntu:~$ sudo systemctl enable elasticsearch.service
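
Elasticsearch listens on port 9200 by default; a quick curl confirms it is up (it can take a few seconds after starting):

anaytemiz@ubuntu:~$ curl http://127.0.0.1:9200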

Installing Logstash and Starting Its Service

anaytemiz@ubuntu:~$ sudo apt install logstash

Now we can start the Logstash service:

anaytemiz@ubuntu:~$ sudo systemctl start logstash.service
anaytemiz@ubuntu:~$ sudo systemctl enable logstash.service

Installing, Configuring and Starting Kibana

anaytemiz@ubuntu:~$ sudo apt install kibana

After Kibana is successfully installed, set server.host to “172.16.160.128” and elasticsearch.hosts to “http://127.0.0.1:9200” in the kibana.yml file under the /etc/kibana directory.

anaytemiz@ubuntu:~$ sudo vim /etc/kibana/kibana.yml
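
After editing, the relevant lines in kibana.yml should look roughly like this (note that the elasticsearch.hosts key exists from Kibana 6.6 onward; earlier 6.x releases use elasticsearch.url instead):

server.host: "172.16.160.128"
elasticsearch.hosts: ["http://127.0.0.1:9200"]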

Now we can start the Kibana service:

anaytemiz@ubuntu:~$ sudo service kibana start
anaytemiz@ubuntu:~$ sudo systemctl enable kibana.service
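
Kibana serves its UI on port 5601 by default, so the dashboard should now be reachable at http://172.16.160.128:5601. A quick check from the shell:

anaytemiz@ubuntu:~$ curl -I http://172.16.160.128:5601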

Configuring F5 ASM

Configure Node, Pool, Pool Member and Virtual Server

According to our topology, the node IP address will be 10.1.20.17, the pool member will be 10.1.20.17:80, and the virtual server will be 10.1.10.35. Now let's configure these on the ASM.

Node Config:

Pool Config:

VS Config:
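
For reference, a roughly equivalent tmsh sequence for these three GUI steps would look like this (the object names dvwa_node, dvwa_pool and vs_dvwa are my own; the http profile is needed so ASM can inspect requests):

tmsh create ltm node dvwa_node address 10.1.20.17
tmsh create ltm pool dvwa_pool members add { 10.1.20.17:80 }
tmsh create ltm virtual vs_dvwa destination 10.1.10.35:80 ip-protocol tcp profiles add { http } pool dvwa_pool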

Now we can create a security policy and assign it to the virtual server we created. I will use the Rapid Deployment policy template to block signature-based attacks immediately. We can create the security policy under Security >> Application Security >> Security Policies >> Create.

Create a Logging Profile to Send Logs to the ELK Stack

We can create the logging profile under Security >> Event Logs >> Logging Profiles >> Create to send logs to the ELK stack. The remote storage destination points to the Ubuntu server (172.16.160.128) on port 5224, which is where the Logstash input defined later will listen.

To activate log forwarding to the ELK stack, we have to assign this logging profile to the virtual server.
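
Before wiring up Logstash, you can confirm that log traffic is actually arriving from the BIG-IP with a quick packet capture on the Ubuntu server (ens33 is the interface from the ifconfig output above, and 5224 is the port the Logstash input below will listen on):

anaytemiz@ubuntu:~$ sudo tcpdump -i ens33 port 5224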

Create Logstash conf File

Now the F5 has started sending web request logs to Logstash, but there is no conf file under the /etc/logstash/conf.d directory yet. We have to create a .conf file to parse the logs and send them to Elasticsearch. The F5 community article I mentioned at the beginning of this article provides a sample .conf file; I will use it here, then restart the Logstash service.

anaytemiz@ubuntu:~$ sudo vim /etc/logstash/conf.d/f5waf.conf

input {
 syslog {
   port => 5224
 }
}
filter {
 grok {
   match => {
     "message" => [
       "attack_type=\"%{DATA:attack_type}\"",
       ",blocking_exception_reason=\"%{DATA:blocking_exception_reason}\"",
       ",date_time=\"%{DATA:date_time}\"",
       ",dest_port=\"%{DATA:dest_port}\"",
       ",ip_client=\"%{DATA:ip_client}\"",
       ",is_truncated=\"%{DATA:is_truncated}\"",
       ",method=\"%{DATA:method}\"",
       ",policy_name=\"%{DATA:policy_name}\"",
       ",protocol=\"%{DATA:protocol}\"",
       ",request_status=\"%{DATA:request_status}\"",
       ",response_code=\"%{DATA:response_code}\"",
       ",severity=\"%{DATA:severity}\"",
       ",sig_cves=\"%{DATA:sig_cves}\"",
       ",sig_ids=\"%{DATA:sig_ids}\"",
       ",sig_names=\"%{DATA:sig_names}\"",
       ",sig_set_names=\"%{DATA:sig_set_names}\"",
       ",src_port=\"%{DATA:src_port}\"",
       ",sub_violations=\"%{DATA:sub_violations}\"",
       ",support_id=\"%{DATA:support_id}\"",
       "unit_hostname=\"%{DATA:unit_hostname}\"",
       ",uri=\"%{DATA:uri}\"",
       ",violation_rating=\"%{DATA:violation_rating}\"",
       ",vs_name=\"%{DATA:vs_name}\"",
       ",x_forwarded_for_header_value=\"%{DATA:x_forwarded_for_header_value}\"",
       ",outcome=\"%{DATA:outcome}\"",
       ",outcome_reason=\"%{DATA:outcome_reason}\"",
       ",violations=\"%{DATA:violations}\"",
       ",violation_details=\"%{DATA:violation_details}\"",
       ",request=\"%{DATA:request}\""
     ]
   }
   break_on_match => false
 }
 mutate {
   split => { "attack_type" => "," }
   split => { "sig_ids" => "," }
   split => { "sig_names" => "," }
   split => { "sig_cves" => "," }
   split => { "staged_sig_ids" => "," }
   split => { "staged_sig_names" => "," }
   split => { "staged_sig_cves" => "," }
   split => { "sig_set_names" => "," }
   split => { "threat_campaign_names" => "," }
   split => { "staged_threat_campaign_names" => "," }
   split => { "violations" => "," }
   split => { "sub_violations" => "," }
 }
 if [x_forwarded_for_header_value] != "N/A" {
   mutate { add_field => { "source_host" => "%{x_forwarded_for_header_value}"}}
 } else {
   mutate { add_field => { "source_host" => "%{ip_client}"}}
 }
 geoip {
   source => "source_host"
 }
}
output {
 elasticsearch {
   hosts => ['127.0.0.1:9200']
   index => "big_ip-waf-logs-%{+YYYY.MM.dd}"
 }
}
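
Before restarting, you can have Logstash validate the new file for syntax errors (the binary path below is where the apt package installs it):

anaytemiz@ubuntu:~$ sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/f5waf.conf
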
anaytemiz@ubuntu:~$ sudo service logstash stop
anaytemiz@ubuntu:~$ sudo service logstash start
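
Once Logstash is back up (give it a moment to start), confirm it is listening on the syslog port:

anaytemiz@ubuntu:~$ sudo ss -tulnp | grep 5224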

Viewing WAF Logs on the ELK Stack

Now we can make a fake request to generate a log entry on the ELK stack. I will perform a simple SQL injection attack against the vulnerable app, then review the attack logs on the ELK stack.
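
For example, a classic tautology payload sent to DVWA's SQL injection page through the virtual server (the URI and parameters are illustrative, and DVWA normally also requires an authenticated session cookie):

anaytemiz@ubuntu:~$ curl "http://10.1.10.35/vulnerabilities/sqli/?id=1%27%20OR%20%271%27=%271&Submit=Submit"

The resulting events land in the daily index, which you can list before adding it as an index pattern in Kibana:

anaytemiz@ubuntu:~$ curl -s 'http://127.0.0.1:9200/_cat/indices?v' | grep big_ip-waf-logs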
