cbimport not importing file which is extracted from cbq command

I extracted data using the cbq command below, which ran successfully.

cbq -u Administrator -p Administrator -e "http://localhost:8093" --script='SELECT * FROM `sample` WHERE customer.id = "12345"' -q | jq '.results' > temp.json;

However, when I try to import the same JSON data into the target cluster using the command below, I get an error.

cbimport json -c http://{target-cluster}:8091 -u Administrator -p Administrator -b sample -d file://C:\Users\{myusername}\Desktop\temp.json -f list -g %docId%

```
JSON import failed: 0 documents were imported, 0 documents failed to be imported
JSON import failed: input json is invalid: ReadArray: expect [ or , or ] or n, but found {, error found in #1 byte of ...|{
    "requ|..., bigger context ...|{
    "requestID": "2fc34542-4387-4643-8ae3-914e316|...
```

The file extracted by cbq looks like this:

```
{
    "requestID": "6ef38b8a-8e70-4c3d-b3b4-b73518a09c62",
    "signature": {
        "*": "*"
    },
    "results": [
        {
            "{Bucket-name}": {my-data}
        }
    ],
    "status": "success",
    "metrics": {
        "elapsedTime": "4.517031ms",
        "executionTime": "4.365976ms",
        "resultCount": 1,
        "resultSize": 24926
    }
}
```

It looks like the file extracted by the cbq command contains control fields such as requestID, status, and metrics, and the JSON is pretty-printed. If I manually remove everything except {my-data}, save that to a JSON file, and compact the JSON, then the import works. But I want to automate this in a single run. Is there a way to do it with the cbq command?
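One way to automate the cleanup is to strip the response envelope and unwrap the per-bucket key before writing the file, so cbimport receives a plain list of documents. Below is a minimal Python sketch of that unwrapping; the sample response is a stub with invented field values (in practice it would come from cbq's stdout), and the `sample` bucket key and `docId` field are assumptions based on the commands above.

```python
import json

def extract_results(cbq_output: str) -> str:
    """Strip the cbq response envelope (requestID, signature, status,
    metrics) and keep only the documents inside "results".
    Each result row is keyed by the bucket name, so unwrap that too."""
    response = json.loads(cbq_output)
    docs = [next(iter(row.values())) for row in response["results"]]
    # Emit a compact (non-pretty) JSON list, the shape cbimport's
    # "-f list" format expects.
    return json.dumps(docs, separators=(",", ":"))

# Stubbed cbq response for illustration; all values are made up.
raw = """{
  "requestID": "6ef38b8a-8e70-4c3d-b3b4-b73518a09c62",
  "signature": {"*": "*"},
  "results": [{"sample": {"docId": "12345", "name": "Alice"}}],
  "status": "success",
  "metrics": {"resultCount": 1}
}"""
print(extract_results(raw))  # -> [{"docId":"12345","name":"Alice"}]
```

Writing that output to temp.json should let `cbimport json ... -f list -g %docId%` consume it directly, assuming each document actually carries a docId field for the key generator.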

I also haven't found any other utility, or a way to apply a WHERE condition with cbexport, that would do this on Couchbase; documents exported using cbexport can be imported with cbimport easily.
