How can I get Graylog to parse my JSON logs correctly?

I have a Rails app and I'm trying to configure logging to Graylog. The pipeline consists of the following steps: 1) Logs are written to a file in JSON format by the SemanticLogger gem. Each log message consists of header info (first-level tags) and a payload with several levels of hierarchy:

{
  "tag": "mortgage",
  "app": "sneakers",
  "pid": 3448,
  "env": "production",
  "host": "thesaurus-mortgage",
  "thread": "91090300",
  "level": "info",
  "name": "Sneakers",
  "payload": {
    "class": "EgrnListenerWorker",
    "method": "work",
    "json": {
      "resource": "kontur",
      "action": "request_egrn_done",
      "system_code": "thesaurus",
      "id": 35883717,
      "project_id": "mortgage",
      "bank_id": "ab",
      "params": {
        "egrn": {
          "zip": "rosreestr/kontur/kontur_4288_2018-10-11_021848.zip",
          "pdf": "rosreestr/kontur/kontur_4288_2018-10-11_021848.pdf",
          "xml": "rosreestr/kontur/kontur_4288_2018-10-11_021848.xml"
        },
        "code": "SUCCESS"
      }
    },
    "valid_json": true
  },
  "created_at": "2018-10-11T17:44:58.262+00:00"
}
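For context, the file appender is set up roughly like this (the path and logger name are placeholders, not my exact config); as far as I understand SemanticLogger, any extra hash passed to a log call ends up under payload in the JSON output:

require "semantic_logger"

# Write logs to a file in JSON format (path is a placeholder)
SemanticLogger.add_appender(
  file_name: "log/sneakers.json.log",
  formatter: :json
)

logger = SemanticLogger["Sneakers"]

# The extra hash appears under "payload" in the JSON output
logger.info("work completed", class: "EgrnListenerWorker", method: "work")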

2) The file is read by the Filebeat service and sent to Graylog.
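For reference, the relevant part of my Filebeat configuration is roughly the following (paths and host are placeholders). I'm not sure whether the JSON decoding should happen on this side (Filebeat has json.* input options and a decode_json_fields processor) or on the Graylog side:

filebeat.inputs:
  - type: log
    paths:
      - /var/www/app/log/sneakers.json.log    # placeholder path
    # Option: let Filebeat decode the top-level JSON itself
    # json.keys_under_root: true
    # json.add_error_key: true

# Graylog Beats input (placeholder host/port)
output.logstash:
  hosts: ["graylog.example.com:5044"]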

However, Graylog does not parse the payload contents correctly: the keys end up concatenated with ":" into a single string, in the form key1=value1:key2=value2. This is not what I expected. It would be perfect if I could get Graylog to parse the contents of payload into separate fields named payload.key1, payload.key2 and so on, so that I could search on those fields.
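My guess is that this needs a processing pipeline rule that parses the stringified payload back into JSON and flattens it into message fields, something like the sketch below. I'm not sure about the details, though: whether to_map is available in my Graylog version, and whether dots are allowed in field names (which is why the sketch uses a payload_ prefix instead of payload.):

rule "parse payload into fields"
when
  has_field("payload")
then
  // parse the payload string back into a JSON tree
  let parsed = parse_json(to_string($message.payload));
  // flatten the first level into fields prefixed with "payload_"
  set_fields(to_map(parsed), "payload_");
end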

PS: my log data is heterogeneous, i.e. the payload contents depend on the functionality that produced them, so I expect there will be a huge number of different fields of the form "payload.xxxxx" - is that okay?