Multiple log sources to Logstash

I am in the learning phase of the ELK stack and have been following Elastic's documentation. I have a couple of Linux servers that I want to monitor using ELK. I have set up two Elasticsearch nodes, one Kibana instance, and one Logstash instance. One Linux server runs a web service (Apache2), and I want to monitor its Apache access and error logs as well as auth.log. I have another Linux server running the same service that I want to monitor as well.

I have followed https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html and am able to feed the Apache access log into Logstash and then into Elasticsearch; however, I am running into an issue feeding the Apache access log, error log, and auth log at the same time.

Below is my Filebeat configuration file:

    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - /var/log/apache2/access.log
      fields:
        type: apachelog

    - type: log
      enabled: true
      paths:
        - /var/log/apache2/error.log
      fields:
        type: apachelog

    - type: log
      enabled: true
      paths:
        - /var/log/apache2/auth.log
      fields:
        type: syslog

    output.logstash:
      hosts: ["10.115.1.5:5044"]
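
One thing I am unsure about: as I understand it, keys declared under `fields` are nested under a top-level `fields` key on each event by default, so in Logstash the value arrives as `[fields][type]` rather than `[type]`. If that is the case, I may need `fields_under_root: true` on each input, something like this (a sketch for the first input, same path as above):

```yaml
- type: log
  enabled: true
  paths:
    - /var/log/apache2/access.log
  fields:
    type: apachelog
  # promote the custom field to the top level of the event,
  # so Logstash conditionals can test [type] directly
  fields_under_root: true
```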

And my configuration file in the conf.d directory is:

    input {
        beats {
            port => "5044"
        }
    }
    filter {
      if [type] == "apachelog" {
        grok {
          match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
        geoip {
          source => "clientip"
        }
      }
      if [type] == "syslog" {
        grok {
          match => { "message" => "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} sshd(?:\[%{POSINT:system.auth.pid}\])?: %{DATA:system.auth.ssh.event} %{DATA:system.auth.ssh.method} for (invalid user )?%{DATA:system.auth.user} from %{IPORHOST:system.auth.ip} port %{NUMBER:system.auth.port} ssh2(: %{GREEDYDATA:system.auth.ssh.signature})?" }
        }
      }
    }
    output {
      if [type] == "apachelog" {
        elasticsearch {
          hosts => ["10.115.1.27", "10.115.1.47"]
          index => "linux1apachelog"
        }
        stdout { codec => rubydebug }
      }
      else {
        elasticsearch {
          hosts => ["10.115.1.27", "10.115.1.48"]
          index => "linux1apachelog"
        }
        stdout { codec => rubydebug }
      }
    }
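
In case it matters: if the custom key from Filebeat's `fields` option really does arrive nested, the alternative (without changing the Filebeat side) would be to reference it as `[fields][type]` in the conditionals, for example (a sketch, not verified):

```
output {
  # test the nested field written by filebeat's `fields` option
  if [fields][type] == "apachelog" {
    elasticsearch {
      hosts => ["10.115.1.27", "10.115.1.47"]
      # index names must be quoted and lowercase
      index => "linux1apachelog"
    }
  }
}
```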

My Logstash instance is able to communicate with both Elasticsearch nodes. Can you please point me in the right direction?