I'm making an object detection app with React Native and the Google Vision API. My app works very well when I send a horizontal image, but if I send a high-resolution vertical image, the API finds objects as if the image were horizontal. The API also works correctly if I send a low-resolution vertical image.
The resolution of the first image is 6936x9248 (it actually appears as 9248x6936 in the Samsung gallery, I don't know why), and the resolution of the second image is 750x1000 (it's 750x1000 in the gallery too).
Here are the Vision responses for both images.
For the first image:

```json
[
  {
    "mid": "/j/5qg9b8",
    "name": "Packaged goods",
    "score": 0.5010282,
    "boundingPoly": {
      "normalizedVertices": [
        { "x": 0.23222321, "y": 0.45171654 },
        { "x": 0.7493964, "y": 0.45171654 },
        { "x": 0.7493964, "y": 0.5577906 },
        { "x": 0.23222321, "y": 0.5577906 }
      ]
    }
  }
]
```
For the second image:

```json
[
  {
    "mid": "/j/5qg9b8",
    "name": "Packaged goods",
    "score": 0.6785547,
    "boundingPoly": {
      "normalizedVertices": [
        { "x": 0.43387607, "y": 0.23372115 },
        { "x": 0.5493168, "y": 0.23372115 },
        { "x": 0.5493168, "y": 0.75314826 },
        { "x": 0.43387607, "y": 0.75314826 }
      ]
    }
  },
  {
    "mid": "/m/01g317",
    "name": "Person",
    "score": 0.53238875,
    "boundingPoly": {
      "normalizedVertices": [
        { "x": 0.16410054, "y": 0.65503424 },
        { "x": 0.6280196, "y": 0.65503424 },
        { "x": 0.6280196, "y": 0.9973958 },
        { "x": 0.16410054, "y": 0.9973958 }
      ]
    }
  }
]
```
As you can see, the results are different; at the very least, the position of the lightsaber should be the same in both. It looks like there is an image orientation issue, but I don't know why.
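My guess (an assumption on my part, not something confirmed by the API docs) is that the high-resolution JPEG stores its pixels rotated and relies on an EXIF Orientation tag for display, so Vision returns coordinates in the stored pixel grid while the gallery shows the rotated view. If that's the case, the normalized vertices could be remapped to the displayed orientation with a small helper like this (`remapVertex` is my own sketch; orientation values follow the EXIF convention: 1 = normal, 3 = 180°, 6 = rotate 90° clockwise to display, 8 = rotate 90° counter-clockwise):

```javascript
// Remap a normalized vertex from the stored-pixel coordinate space to the
// displayed (EXIF-rotated) coordinate space. Coordinates are in [0, 1].
function remapVertex({ x, y }, orientation) {
  switch (orientation) {
    case 3: return { x: 1 - x, y: 1 - y }; // image displayed rotated 180°
    case 6: return { x: 1 - y, y: x };     // displayed rotated 90° clockwise
    case 8: return { x: y, y: 1 - x };     // displayed rotated 90° counter-clockwise
    default: return { x, y };              // orientation 1 or missing: no change
  }
}
```

For example, with orientation 6 the stored top-left corner `{ x: 0, y: 0 }` maps to the displayed top-right corner `{ x: 1, y: 0 }`, which matches how a gallery app rotates the photo for viewing.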
This is the request that I send to the Vision API:
```javascript
await fetch(config.googleCloud.api + config.googleCloud.apiKey, {
  method: 'POST',
  body: JSON.stringify({
    requests: [
      {
        image: {
          content: base64Image, // the base64-encoded image data
        },
        features: [
          { type: 'OBJECT_LOCALIZATION', maxResults: 15 },
        ],
      },
    ],
  }),
});
```
Is the problem in my request?
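To test my orientation theory, I put together a minimal helper to read the EXIF Orientation tag directly from the JPEG bytes (this is my own sketch, assuming the standard APP1/TIFF layout and that the tag lives in the first IFD; in React Native the `Buffer` would have to come from a polyfill or from decoding the base64 string):

```javascript
// Return the EXIF Orientation value (1-8) of a JPEG buffer, or 1 if the
// buffer is not a JPEG or carries no Orientation tag.
function getExifOrientation(buf) {
  if (buf.length < 4 || buf[0] !== 0xff || buf[1] !== 0xd8) return 1; // not a JPEG (no SOI marker)
  let offset = 2;
  while (offset + 4 <= buf.length) {
    if (buf[offset] !== 0xff) break; // lost sync with the marker stream
    const marker = buf[offset + 1];
    const size = (buf[offset + 2] << 8) | buf[offset + 3]; // segment length, incl. these 2 bytes
    if (marker === 0xe1 && buf.toString('ascii', offset + 4, offset + 10) === 'Exif\0\0') {
      const tiff = offset + 10; // start of the TIFF header inside APP1
      const little = buf.toString('ascii', tiff, tiff + 2) === 'II'; // byte order: 'II' = little-endian
      const read16 = (o) => (little ? buf.readUInt16LE(o) : buf.readUInt16BE(o));
      const read32 = (o) => (little ? buf.readUInt32LE(o) : buf.readUInt32BE(o));
      const ifd = tiff + read32(tiff + 4); // offset of the first IFD
      const entries = read16(ifd);
      for (let i = 0; i < entries; i++) {
        const entry = ifd + 2 + i * 12; // each IFD entry is 12 bytes
        if (read16(entry) === 0x0112) return read16(entry + 8); // 0x0112 = Orientation tag
      }
      return 1; // APP1 present but no Orientation tag
    }
    offset += 2 + size; // skip to the next marker segment
  }
  return 1;
}
```

If this returns something other than 1 for the high-resolution photo and 1 for the low-resolution one, that would explain the difference: Vision would be localizing objects in the un-rotated pixel data, and I'd need to bake the rotation into the pixels (re-encode the image) before sending it.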