Vertex Pipeline Metric values not being added to metrics artifact?

We are trying to return some metrics from our Vertex Pipeline, such that they are visible in the Run Comparison and Metadata tools in the Vertex UI.

I saw here that we can use the Output[Metrics] output type, and its metrics.log_metric("metric_name", metric_val) method, to add the metrics; from the available documentation it seemed that this would be enough.
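
To be explicit about our mental model: we understand log_metric as simply accumulating key/value pairs on the artifact's metadata, which Vertex then surfaces in the Run Comparison and Metadata views. A minimal mock of that expectation (this is our own sketch, not the real kfp class):

```python
# Hypothetical mock of what we understand Metrics.log_metric to do:
# store key/value pairs on the artifact's metadata dict. (Sketch only,
# not the actual kfp implementation.)
class FakeMetrics:
    def __init__(self):
        self.metadata = {}

    def log_metric(self, metric: str, value: float):
        self.metadata[metric] = value

metrics = FakeMetrics()
metrics.log_metric("accuracy", 0.85)
print(metrics.metadata)  # {'accuracy': 0.85}
```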

We want to use the reusable component approach rather than the Python function-based components that the example is built around. So we implemented it within our component code like so:

We added the output in the component.yaml:

    - name: metrics
      type: Metrics
      description: evaluation metrics path

then added the output to the command in the implementation:

        command: [
            --gcs-test-data-path,       {inputValue: gcs_test_data_path},
            --gcs-model-path,  {inputValue: gcs_model_path},
            --gcs-output-bucket-id,  {inputValue: gcs_output_bucket_id},
            --project-id, {inputValue: project_id},
            --timestamp, {inputValue: timestamp},
            --batch-size, {inputValue: batch_size},
            --img-height, {inputValue: img_height},
            --img-width,  {inputValue: img_width},
            --img-depth,  {inputValue: img_depth},
        --metrics,  {outputPath: metrics},
        ]

Next, in the component's main Python script, we parse this argument with argparse:

    PARSER.add_argument('--metrics',
                        type=str,
                        help='evaluation metrics output')

and pass it to the component's main function:

    if __name__ == '__main__':
        ARGS = PARSER.parse_args()
        evaluation(gcs_test_data_path=ARGS.gcs_test_data_path,
                   # ... remaining arguments ...
                   metrics=ARGS.metrics)

In the declaration of the component function, we then annotated this metrics parameter as Output[Metrics]:

    from kfp.v2.dsl import Output, Metrics

    def evaluation(gcs_test_data_path: str,
                   gcs_model_path: str,
                   gcs_output_bucket_id: str,
                   metrics: Output[Metrics],
                   project_id: str,
                   timestamp: str,
                   batch_size: int,
                   img_height: int,
                   img_width: int,
                   img_depth: int):

Finally, we call the log_metric method within this evaluation function:

    metrics.log_metric('accuracy', acc)
    metrics.log_metric('precision', prec)
    metrics.log_metric('recall', recall)
    metrics.log_metric('f1-score', f_1)
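
(For context, acc, prec, recall and f_1 are plain floats computed earlier in the evaluation function. A minimal sketch of how such values might be derived from confusion-matrix counts — the counts below are made up purely for illustration:)

```python
# Illustrative only: derive the four logged values from hypothetical
# confusion-matrix counts. tp/fp/fn/tn are made-up numbers, not our data.
tp, fp, fn, tn = 40, 5, 10, 45

acc = (tp + tn) / (tp + fp + fn + tn)      # overall accuracy
prec = tp / (tp + fp)                      # precision
recall = tp / (tp + fn)                    # recall
f_1 = 2 * prec * recall / (prec + recall)  # harmonic mean of prec and recall

print(acc, recall)  # 0.85 0.8
```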

When we run this pipeline, we can see this metric artifact materialised in the DAG:

Metrics artifact visible in the DAG

And Metrics Artifacts are listed in the Metadata UI in Vertex:

Metrics Artifacts are listed in the metadata UI

However, clicking through to view the artifacts JSON, there is no Metadata listed:

No Metadata attached to artifact

In addition, no metadata is visible when comparing runs in the pipeline UI:

No Metadata in the pipeline UI

Finally, navigating to the object's URI in GCS, we are met with 'Requested entity was not found.', which I assume indicates that nothing was written to GCS:

No Object in GCS

Are we doing something wrong in this implementation of metrics in reusable components? From what I can tell, this all seems right, but it's hard to verify, given that the docs at this point focus primarily on examples with Python function-based components.

Do we perhaps need to proactively write this Metrics object to an OutputPath?
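
i.e., something along these lines at the end of the evaluation function? (This is a guess on our part — the file location and JSON shape below are assumptions, not documented behaviour:)

```python
# Hypothetical: serialise the metrics ourselves to the path KFP passes in
# via --metrics. Whether Vertex expects such a file, or what schema it
# would expect, is exactly what we are unsure about.
import json
import os

def write_metrics_file(metrics_path: str, values: dict) -> None:
    os.makedirs(os.path.dirname(metrics_path) or ".", exist_ok=True)
    with open(metrics_path, "w") as f:
        json.dump(values, f)

# Example invocation with made-up values and a made-up local path:
write_metrics_file("/tmp/example_metrics/metrics.json",
                   {"accuracy": 0.85, "recall": 0.8})
```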

Any help is appreciated.
