Errors and freezes in iOS Vision API "computeBlinkFunction" during feature detection

I use the iOS Vision API to perform feature detection in real time on ARFrame buffers passed from the camera on an iPhone X. It usually works quite well, but on both iOS 11 and iOS 12 I've been seeing occasional errors in the console that I haven't been able to find any information on. The error prints out multiple times in a row and usually correlates with severe freezes in my app.

Has anyone seen this error before in their console, or have any idea what causes it? Any information or debugging tips would be greatly appreciated.

LandmarkDetector error -20:out of bounds in int vision::mod::LandmarkAttributes::computeBlinkFunction(const vImage_Buffer &, const Geometry2D_rect2D &, const std::vector<Geometry2D_point2D> &, vImage_Buffer &, vImage_Buffer &, std::vector<float> &, std::vector<float> &) @ /BuildRoot/Library/Caches/

Some more info: I think this error occurs during my VNDetectFaceLandmarksRequest. Currently I pass the face bounding box found by ARKit, normalized to Vision coordinates, into inputFaceObservations, but I also saw these errors when I used VNDetectFaceRectanglesRequest to find the face bounding box. I am using a VNSequenceRequestHandler to make these requests in realtime on each ARFrame from the camera, running on a background thread.
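For reference, here is a minimal sketch of the setup described above. This is not my exact code: the function name detectLandmarks is hypothetical, and I'm assuming the ARKit face rect has already been converted to Vision's normalized, lower-left-origin coordinate space.

```swift
import ARKit
import Vision

// One handler reused across frames so Vision can exploit temporal coherence.
let sequenceHandler = VNSequenceRequestHandler()

// Hypothetical per-frame entry point, called on a background queue.
// `faceRect` is the ARKit face bounding box in Vision's normalized coordinates.
func detectLandmarks(in frame: ARFrame, faceRect: CGRect) {
    let request = VNDetectFaceLandmarksRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        // ... consume faces.first?.landmarks here ...
    }
    // Seed the request with the known face rect so Vision skips its own face detector.
    request.inputFaceObservations = [VNFaceObservation(boundingBox: faceRect)]

    do {
        try sequenceHandler.perform([request],
                                    on: frame.capturedImage,
                                    orientation: .right)
    } catch {
        print("Landmark request failed: \(error)")
    }
}
```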

1 answer

  • answered 2018-12-06 21:56 miles_b

Something that was covered at this year's WWDC session on Vision is that detection requests are much more resource-intensive than tracking requests. I believe the recommendation was that once you receive a detection, you should stop your detection requests and switch to tracking instead. One thing I've noticed in my own Vision code (I'm using it for text and barcodes) is that once you get an initial detection, you start getting a flood of observations, and if that goes on long enough, Vision may be producing observations faster than the system can handle.
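The detect-once-then-track pattern described above could look something like this. This is a hedged sketch, not code from the question: the process function is hypothetical, and I'm assuming a single face per frame.

```swift
import Vision

let sequenceHandler = VNSequenceRequestHandler()
// Last known face observation; nil until the first detection succeeds.
var trackedFace: VNDetectedObjectObservation?

// Hypothetical per-frame entry point.
func process(pixelBuffer: CVPixelBuffer) {
    if let face = trackedFace {
        // Cheap tracking request that reuses the previous observation.
        let tracking = VNTrackObjectRequest(detectedObjectObservation: face)
        tracking.trackingLevel = .accurate
        try? sequenceHandler.perform([tracking], on: pixelBuffer)
        // Feed the updated observation back in for the next frame.
        trackedFace = tracking.results?.first as? VNDetectedObjectObservation
    } else {
        // Expensive detection request, run only until a face is found.
        let detection = VNDetectFaceRectanglesRequest()
        try? sequenceHandler.perform([detection], on: pixelBuffer)
        trackedFace = detection.results?.first as? VNFaceObservation
    }
}
```

If tracking is ever lost (the observation's confidence drops or results come back empty), you would reset trackedFace to nil so the next frame falls back to detection.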