This pipeline processes Symantec Endpoint Protection logs. It expects events with a syslog header, in the following format:

```
<51>Dec 8 11:08:30 devhost SymantecServer: server02,Event Description: Portscan,Local Host IP: 192.168.50.2,Local Host MAC: 000000000000,Remote Host Name: ,Remote Host IP: 192.168.50.3,Remote Host MAC: 000000000000,Inbound,TCP,Intrusion ID: 0,Begin: 2020-12-08 09:08:01,End Time: 2020-12-08 09:08:01,Occurrences: 1,Application: ,Location: Office,User Name: johndoe,Domain Name: local,Local Port: 0,Remote Port: 0,CIDS Signature ID: 10000,CIDS Signature string: Portscan,CIDS Signature SubID: 0,Intrusion URL: ,Intrusion Payload URL: ,SHA-256: ,MD-5:
```
- Install the pipeline definition to Elasticsearch using the Kibana Dev Tools Console:

  ```
  PUT _ingest/pipeline/symantec-endpoint
  { /* JSON pipeline content */ }
  ```

  or use `curl`:

  ```
  curl -XPUT "https://es:9200/_ingest/pipeline/symantec-endpoint" -H 'Content-Type: application/json' -d@symantec-endpoint-pipeline.json
  ```
- Add a log file input to `filebeat.yml`. The `udp` input could be used alternatively.

  ```yaml
  filebeat.inputs:
    - type: log
      paths:
        - /var/log/symantec-endpoint*.log
      tags: [symantec-endpoint, forwarded]
      pipeline: symantec-endpoint
  ```

- Restart Filebeat.
@herrBez I opened an issue at elastic/integrations#471 that mentions some of the issues you pointed out.
One problem with using `split` is that some values can contain commas. I've seen some events that had `,"User Name: Doe,John",` for example, so I think a proper CSV decoder will be needed. I'm thinking of asking Elasticsearch to expose a CSV decode function in Painless that returns an array. Then we could run that array through the foreach/kv loop that you have.
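To illustrate the problem, here is a sketch in Python (not Painless, which has no CSV decode function yet): a naive comma split tears apart a quoted field like `"User Name: Doe,John"`, while a CSV-aware parser keeps it intact. The sample line and the key/value loop below are illustrative, not the actual pipeline code.

```python
import csv
import io

# Shortened sample event containing a quoted value with an embedded comma.
line = 'server02,Event Description: Portscan,"User Name: Doe,John",Domain Name: local'

# Naive split: the quoted field is broken into '"User Name: Doe' and 'John"'.
naive = line.split(",")

# CSV-aware parsing keeps 'User Name: Doe,John' as a single field.
fields = next(csv.reader(io.StringIO(line)))

# The resulting array can then be fed through a key/value loop,
# analogous to the foreach/kv step in the ingest pipeline.
event = {}
for field in fields:
    if ": " in field:
        key, value = field.split(": ", 1)
        event[key] = value

print(naive)
print(fields)
print(event)
```

Running this shows the naive split producing five fragments while the CSV parse yields the intended four fields, with `User Name` mapped to `Doe,John`.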