I want to virtually add a football, then detect and track the user's foot so that we can simulate a kick to the ball.
Can anyone please suggest a way to achieve this in iOS?
There are several ready-made solutions for detecting parts of the body (including the legs), provided that you are standing at full height in front of the camera; the latest versions of Apple ARKit can do this out of the box.
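For that full-body case, here is a minimal sketch of ARKit's built-in body tracking (`ARBodyTrackingConfiguration` and the skeleton joint names are standard ARKit API; the wrapper class itself is only illustrative):

```swift
import ARKit
import RealityKit

// Minimal sketch: full-body tracking with ARKit (iOS 13+, A12 chip or newer).
final class BodyTracker: NSObject, ARSessionDelegate {
    let arView = ARView(frame: .zero)

    func start() {
        // Body tracking is only supported on devices with an A12 chip or newer.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        arView.session.delegate = self
        arView.session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // The joint transform is relative to the body root (the hip).
            guard let footLocal = bodyAnchor.skeleton.modelTransform(for: .rightFoot) else { continue }
            // Convert to world space via the body anchor's transform.
            let footWorld = bodyAnchor.transform * footLocal
            let position = simd_make_float3(footWorld.columns.3)
            print("Right foot at \(position)")
        }
    }
}
```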
However, if you are looking for a solution that tracks your own leg, you are facing a research and development phase, because the angle at which the user holds the device is atypical and the picture is quite limited: only part of the leg and the foot are visible.
You will have to:

1. Detect the foot and its control points in the camera frames.
2. Render a virtual ball in the scene.
3. Simulate the kick when the detected foot reaches the ball.

Using native ARKit (the iOS way), steps 2 and 3 are not the issue: you can render the 3D model and even drive it with control points. The main problem is step 1: finding the leg with all of its control points at an atypical angle, when you are scanning your own legs rather than an entire person standing in front of you.
So you may need an ML engineer (possibly working in Python) to create the ML model first. Then use a native AR platform (such as ARKit on iOS) or Unity (if not native) to run the model and render the scene relative to the detected legs.
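For the rendering and kick-simulation side on iOS, here is a minimal SceneKit sketch, assuming the foot position and velocity come from whatever detection step you build (the `footVelocity` input below is a hypothetical output of that step, e.g. the frame-to-frame change of the detected foot position):

```swift
import ARKit
import SceneKit

final class BallRenderer {
    private let ballNode: SCNNode

    init(sceneView: ARSCNView) {
        // A simple 11 cm sphere standing in for the football.
        let sphere = SCNSphere(radius: 0.11)
        sphere.firstMaterial?.diffuse.contents = UIColor.white
        ballNode = SCNNode(geometry: sphere)
        // A dynamic physics body lets SceneKit handle the ball's motion.
        ballNode.physicsBody = SCNPhysicsBody(type: .dynamic,
                                              shape: SCNPhysicsShape(geometry: sphere))
        sceneView.scene.rootNode.addChildNode(ballNode)
    }

    /// Place the ball at a world-space position (e.g. on a detected plane).
    func place(at position: SCNVector3) {
        ballNode.position = position
    }

    /// Simulate the kick: apply an impulse scaled from the detected
    /// foot velocity (hypothetical input from the detection step).
    func kick(with footVelocity: SCNVector3) {
        let impulse = SCNVector3(footVelocity.x * 0.5,
                                 footVelocity.y * 0.5,
                                 footVelocity.z * 0.5)
        ballNode.physicsBody?.applyForce(impulse, asImpulse: true)
    }
}
```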
I think you'll have to build your own Core ML model that identifies where a foot is in an image, and pass it the frames captured by the camera. ARKit doesn't do that, and there's no foot recognition built into iOS 11.
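A minimal sketch of that pipeline with Vision and Core ML, assuming you have already trained your own foot detector (the `modelURL`, and the assumption that it is an object-detection model, are placeholders for whatever you train; nothing like this ships with iOS):

```swift
import ARKit
import CoreML
import Vision

final class FootDetector {
    private let visionModel: VNCoreMLModel

    /// `modelURL` points at your compiled Core ML model (.mlmodelc).
    /// You must train this model yourself.
    init(modelURL: URL) throws {
        let mlModel = try MLModel(contentsOf: modelURL)
        visionModel = try VNCoreMLModel(for: mlModel)
    }

    /// Run the detector on the current AR camera frame.
    func detectFoot(in frame: ARFrame,
                    completion: @escaping ([VNRecognizedObjectObservation]) -> Void) {
        let request = VNCoreMLRequest(model: visionModel) { request, _ in
            // Assumes the custom model is an object detector; adapt the
            // observation type if yours outputs keypoints instead.
            let results = request.results as? [VNRecognizedObjectObservation] ?? []
            completion(results)
        }
        // ARKit camera buffers are landscape; .right maps them to portrait.
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right)
        try? handler.perform([request])
    }
}
```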