Mapping array elements in Elasticsearch to named fields?

I am trying to parse Flask logs for use in Elasticsearch. I cannot change the format of the logs or add an Elasticsearch handler to the logging module, so I have to make do with what I have. A log line currently looks like this:

2018-08-08 16:43:30,010 | INFO     |<secop.parse_t           >| Job "SM.sync_outbound (trigger: interval[0:01:00], next run at: 2018-08-08 16:44:29 PDT)" executed successfully

I need to map the first column to TIMESTAMP, the second to LOGLEVEL, the third to FUNCTION, and the fourth to DATA. I tried to map it using the grok processor, but I couldn't figure out a way for it to handle the variable-width whitespace padding. What I did instead was set up a processor that splits the line on the pipes and outputs the pieces as an array (a sketch of that setup follows the example below). So on the ES side, the data went from

"message":"2018-08-08 16:43:30,010 | INFO     |<secop.parse_t           >| Job \"SM.sync_outbound (trigger: interval[0:01:00], next run at: 2018-08-08 16:44:29 PDT)\" executed successfully"

to

"split":["2018-08-08 16:43:30,010 "," INFO     ","<secop.parse_t          >"," Job \"SM.sync_outbound (trigger: interval[0:01:00], next run at: 2018-08-08 16:44:29 PDT)\" executed successfully"]

I now need to map those array elements into named fields, but I'm not quite sure how to do that. I would also potentially need to map a fifth column for exception tracebacks, but I'm equally unsure how to go about that. Does anyone know if this is possible?
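
For clarity, the end result I'm after is a document shaped roughly like this, with the padding whitespace and angle brackets stripped (the uppercase field names are just the labels described above):

{
  "TIMESTAMP": "2018-08-08 16:43:30,010",
  "LOGLEVEL": "INFO",
  "FUNCTION": "secop.parse_t",
  "DATA": "Job \"SM.sync_outbound (trigger: interval[0:01:00], next run at: 2018-08-08 16:44:29 PDT)\" executed successfully"
}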