I am writing an application that does image processing (with Core Image) on a 2D image of a face, alongside a saved instance of ARSCNFaceGeometry. The trouble I am having is calculating the x,y point in Core Image coordinates that corresponds to a given vertex in ARFaceGeometry.vertices.
I capture the 2D image by calling ARSCNView.snapshot(), then store and process it as a CIImage.
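For context, the capture step looks roughly like this (a sketch: `sceneView` stands in for my ARSCNView, and the `CIImage(image:)` initializer is failable, hence the guard):

```swift
// Take a snapshot of the current AR scene as a UIImage,
// then wrap it in a CIImage for Core Image processing.
let snapshot: UIImage = sceneView.snapshot()
guard let imgAsCIImage = CIImage(image: snapshot) else {
    // Snapshot could not be converted; bail out.
    return nil
}
```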
I am currently using the geometry's textureCoordinates to try to calculate the x,y position on the CIImage, but I don't have much experience with Core Image and couldn't work out whether this is the attribute I should be using.
Here is what I currently have to calculate the coordinates of a point in CIImage x,y space. I'm trying to produce a CIVector for the point. What am I doing wrong?
let imgAsCIImage = /* The CIImage of the ARSCNView snapshot */
let faceDotPos = /* The vertex index I am calculating the point for */

// Texture coordinates are normalized (0...1), so scale them
// by the image's extent to get pixel coordinates.
let pointTexCoord = faceGeometry.textureCoordinates[faceDotPos]
let imageFrame = imgAsCIImage.extent
let xPoint = CGFloat(pointTexCoord.x) * imageFrame.width
let yPoint = CGFloat(pointTexCoord.y) * imageFrame.height
return CIVector(x: xPoint, y: yPoint)