How to detect if a HitTest hits a given Mesh object?


I'm following the Geospatial example from Google and created the 3D model mesh at a given coordinate successfully. Now I have implemented a hit test and get results, but I don't know how to detect whether it hit the mesh or not.

I have tried code like this, without success:

    if (motionEvent != null) {
        //val hitResultList = frame.hitTestInstantPlacement(motionEvent!!.x, motionEvent!!.y, 2.0f)
        val hitResultList = frame.hitTest(motionEvent)
        val debug = hitResultList.toString()
        if (hitResultList.size > 0) {
            Log.d(TAG, debug)

            for (hit in hitResultList) {
                val trackable = hit.trackable
                Log.d(TAG, trackable.toString())

                if (hit.equals(virtualObjectMesh)) {
                    Log.d(TAG, "tapped!")
                }
                if (hit.equals(earthAnchor)){
                    Log.d(TAG, "tapped!")
                }
            }
        }
        motionEvent = null
    }

Accepted answer (by VonC):

From what I can see in the documentation ("ARCore / Perform hit-tests in your Android app"), to detect if a HitTest result hits a given Mesh object in ARCore or similar AR platforms, you would need to compare the Trackable obtained from the hit result with the instance of the Mesh or the object that you are interested in.

However, directly comparing HitResult objects with your Mesh using equals is not the correct approach, as HitResult and your Mesh object are different types of objects.
See "Comparing Objects in Java" from François Dupire and reviewed by Greg Martin

Instead, you should check the type of the Trackable associated with each HitResult to see if it is of the type you expect (for example, a Plane or a Point). Then, if your Mesh object is attached to an Anchor or has other identifiable characteristics, use those to determine whether the hit test has intersected your Mesh.

if (motionEvent != null) {
    val hitResultList = frame.hitTest(motionEvent)
    if (hitResultList.isNotEmpty()) {
        for (hit in hitResultList) {
            val trackable = hit.trackable

            // Assuming your mesh is associated with an Anchor
            if (trackable is Anchor && trackable == yourMeshAnchor) {
                Log.d(TAG, "Mesh tapped!")
            }

            // If you are looking for hits on specific types of trackables, like Planes
            if (trackable is Plane) {
                Log.d(TAG, "Plane tapped!")
                // Further checks can be done here to see if this Plane is related to your mesh
            }

            // If you have other conditions or objects to check, do so here
        }
    }
    motionEvent = null
}

Here, yourMeshAnchor should be the Anchor you used when placing your Mesh in the AR scene.

So the exact method to identify whether a hit test intersects your specific Mesh depends on how your Mesh is integrated into the AR scene and on the capabilities of the AR platform you are using. If your Mesh is a custom object not directly supported as a Trackable type in ARCore, you may need to implement additional logic to map between Trackable objects and your custom objects, for example as sketched below.
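
For instance, a minimal sketch of such a mapping (PlacedModel, placedModels, distanceBetween and findTappedModel are hypothetical names; earthAnchor and virtualObjectMesh are the objects from your own code) would keep a registry of the anchors you create and resolve a hit by looking for the nearest registered anchor:

// Hypothetical registry mapping the anchors you create to the models you render.
// Assumes com.google.ar.core.Anchor and com.google.ar.core.Pose; Mesh is whatever
// render type you use for virtualObjectMesh.
data class PlacedModel(val anchor: Anchor, val mesh: Mesh)

val placedModels = mutableListOf<PlacedModel>()
// When placing content, register it, e.g.:
// placedModels += PlacedModel(earthAnchor, virtualObjectMesh)

// Straight-line distance between two poses, in meters.
fun distanceBetween(a: Pose, b: Pose): Float {
    val dx = a.tx() - b.tx()
    val dy = a.ty() - b.ty()
    val dz = a.tz() - b.tz()
    return kotlin.math.sqrt(dx * dx + dy * dy + dz * dz)
}

// Resolve a hit pose to the closest placed model, if one is within maxDistanceMeters.
fun findTappedModel(hitPose: Pose, maxDistanceMeters: Float = 0.5f): PlacedModel? =
    placedModels
        .minByOrNull { distanceBetween(it.anchor.pose, hitPose) }
        ?.takeIf { distanceBetween(it.anchor.pose, hitPose) < maxDistanceMeters }

// In your hit test loop, you would then call:
// val tapped = findTappedModel(hit.hitPose)
// if (tapped != null) Log.d(TAG, "tapped!")

This avoids comparing HitResult objects directly and instead compares world-space positions, which is the same idea developed further below.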


I have changed my code according to your suggestion; I get only Plane and Point trackable types, but I never get an Anchor type...
By the way, I have a terrain anchor with a custom OBJ model attached; I'm using ARCore Geospatial.

In ARCore, when you perform a hit test, the API returns HitResult objects that reference Trackable types like Plane and Point. These represent real-world surfaces and points that ARCore has detected.
Anchors, however, are not directly returned as part of hit test results; instead, they are used to create stable positions in the world for placing objects.

When you have a custom OBJ model attached to a terrain anchor, the hit test will not directly indicate a hit on the "Anchor" itself but rather on the trackable surface (like a Plane) upon which the anchor and model are placed.

Considering your scenario with ARCore Geospatial, it is important to understand that Geospatial anchors are used to position content at real-world coordinates, but detecting interactions with the content (like a custom model) attached to these anchors requires mapping hit test results to the anchored content yourself.

In other words: the challenge is determining whether a hit test, initiated by a user's touch input, intersects with a custom object (e.g., a 3D model) that does not directly correspond to the types of Trackable objects (Plane, Point) that ARCore's hit tests usually detect.

The complete logic involves two main steps:

  1. Performing the Hit Test: Capturing the user's touch input and using it to perform a hit test, which returns a list of HitResult objects.
  2. Mapping Hit Test Results to Anchored Content: Determining whether any of these hit test results intersect with the location and bounds of your anchored 3D model.

That would involve:

  • Detect when the user touches the screen and obtain the MotionEvent.
  • Use this MotionEvent to perform a hit test against the AR scene, which returns a list of HitResult objects. These objects represent intersections with Trackable surfaces in the AR world.
  • For each hit result, extract the pose (position and orientation in space) of the hit.
  • For each hit pose, check if it falls within the spatial bounds of your anchored model. That involves comparing the hit pose to the model's anchor pose and the predefined bounds of the model.
    • Understand the spatial dimensions and bounds of your model relative to its anchor point. That could be a simple bounding box or a more complex shape that accurately encompasses your model.
    • Calculate if the hit pose is within these bounds, taking into account the anchor's pose as the reference point for your model's location in the AR world.
  • If a hit test result is determined to be within the bounds of your model, perform your desired interaction logic (e.g., highlighting the model, displaying information, etc.).

// Assuming you have access to `frame` from ARCore and a `motionEvent` from user input

// A function to check if a hit test point is within your model's bounds
fun isHitWithinModelBounds(hitPose: Pose, modelAnchorPose: Pose, modelBounds: ModelBounds): Boolean {
    // Implement logic here to determine if the hitPose falls within the modelBounds
    // relative to the modelAnchorPose
    return false // Example implementation; replace with actual logic
}

// Define the model's bounds relative to its anchor
val modelBounds = getModelBounds() // Assume this function defines the 3D model's bounds
val yourTerrainAnchorPose = yourTerrainAnchor.pose // Pose of your terrain anchor

if (motionEvent != null) {
    val hitResultList = frame.hitTest(motionEvent)
    for (hit in hitResultList) {
        val hitPose = hit.hitPose
        // Check if the hit is within the bounds of your model
        if (isHitWithinModelBounds(hitPose, yourTerrainAnchorPose, modelBounds)) {
            Log.d(TAG, "Custom model tapped!")
            // Perform your interaction logic here (e.g., highlight the model, show details, etc.)
        }
    }
    motionEvent = null // Reset the motion event to avoid processing it multiple times
}

getModelBounds() is a hypothetical function that you would implement based on the specific dimensions and shape of your 3D model.

isHitWithinModelBounds(hitPose, modelAnchorPose, modelBounds) checks whether a given hit test result falls within the defined bounds of your model, considering the model's anchor pose as the reference point. The loop through hitResultList evaluates each hit test result to determine if it interacts with your model.
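
As a sketch only (assuming the model's bounds can be approximated by an axis-aligned box expressed in the anchor's local coordinate frame), the two hypothetical helpers could look like this:

// Hypothetical bounds type: half-extents of an axis-aligned box, in meters,
// expressed in the anchor's local coordinate frame.
data class ModelBounds(val halfX: Float, val halfY: Float, val halfZ: Float)

// Example values for a model roughly 1 m wide, 2 m tall and 1 m deep, centered on its anchor.
fun getModelBounds() = ModelBounds(halfX = 0.5f, halfY = 1.0f, halfZ = 0.5f)

fun isHitWithinModelBounds(hitPose: Pose, modelAnchorPose: Pose, modelBounds: ModelBounds): Boolean {
    // Transform the hit point from world space into the anchor's local frame,
    // then do a simple box containment test.
    val localHit = modelAnchorPose.inverse()
        .transformPoint(floatArrayOf(hitPose.tx(), hitPose.ty(), hitPose.tz()))
    return kotlin.math.abs(localHit[0]) <= modelBounds.halfX &&
            kotlin.math.abs(localHit[1]) <= modelBounds.halfY &&
            kotlin.math.abs(localHit[2]) <= modelBounds.halfZ
}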


Geospatial anchors are created from Earth, which is a Trackable too, but the Earth documentation says: "Earth does not support hit testing and will never be a result from Frame.hitTest(MotionEvent). Because Earth is a type of Trackable, the singleton Earth instance may also be returned by Session.getAllTrackables(Class) when enabled."

True: the Earth class behaves differently from other Trackable types.
As noted, the Earth class, which represents the planet in ARCore's Geospatial API, does not support hit testing directly. That means you cannot directly hit test against the Earth itself to determine if a user's interaction (e.g., a tap) intersects with a Geospatial anchor or the associated virtual content.

That limitation necessitates a more manual approach to determining interactions with virtual objects placed via Geospatial anchors, relying on spatial mathematics rather than ARCore's built-in hit testing mechanisms for immediate results.
Earth is a global trackable that represents the entire globe and supports placing anchors at latitude, longitude, and altitude, but it does not support direct hit testing through Frame.hitTest(MotionEvent). That is logical, as hit testing is generally used to find intersections with detected planes, points, or other local features in the user's immediate environment.
Geospatial anchors are tied to specific real-world coordinates, but are not directly interactable through the standard hit test mechanism. Instead, they are positioned relative to the Earth trackable.

Given this limitation, if you want to detect interactions with virtual objects placed at Geospatial anchor locations, you must implement a custom solution that involves:

  • determining the screen position of the geospatial anchor: convert the geospatial anchor's 3D world position to a 2D screen position. That can be done by projecting the anchor's world-space position to screen coordinates with the camera's view and projection matrices (Camera.getViewMatrix() and Camera.getProjectionMatrix()).

  • handling user input: when the user interacts with the screen (e.g., taps), compare the 2D screen coordinates of the tap with the 2D screen coordinates of your Geospatial anchors' virtual objects.

  • implementing the spatial check: if the distance between the tap coordinates and the virtual object's screen coordinates is within a certain threshold, you can consider the tap as interacting with that virtual object.

As a simplified example:

// Assuming you have a MotionEvent from the user's tap and a list of Geospatial anchors
val tapScreenX = motionEvent.x
val tapScreenY = motionEvent.y

// Project each anchor's world-space position to 2D screen coordinates.
// ARCore's Camera exposes no direct projection call, so use its view and
// projection matrices (android.opengl.Matrix; viewWidth/viewHeight are
// assumed to be the dimensions of your GL surface).
val camera = frame.camera
val viewMatrix = FloatArray(16).also { camera.getViewMatrix(it, 0) }
val projMatrix = FloatArray(16).also { camera.getProjectionMatrix(it, 0, 0.1f, 100f) }
val viewProj = FloatArray(16).also { Matrix.multiplyMM(it, 0, projMatrix, 0, viewMatrix, 0) }

val anchorScreenPositions = geospatialAnchors.map { anchor ->
    val pose = anchor.pose
    val clip = FloatArray(4)
    Matrix.multiplyMV(clip, 0, viewProj, 0, floatArrayOf(pose.tx(), pose.ty(), pose.tz(), 1f), 0)
    if (clip[3] <= 0f) return@map floatArrayOf(Float.NaN, Float.NaN) // anchor is behind the camera
    floatArrayOf(
        (clip[0] / clip[3] + 1f) / 2f * viewWidth,  // NDC x -> screen x
        (1f - clip[1] / clip[3]) / 2f * viewHeight  // NDC y -> screen y (origin top-left)
    )
}

// Check if the tap is close to any of the anchor's screen positions
anchorScreenPositions.forEachIndexed { index, screenPosition ->
    val distance = Math.hypot((tapScreenX - screenPosition[0]).toDouble(), (tapScreenY - screenPosition[1]).toDouble())
    if (distance < YOUR_THRESHOLD) {
        Log.d(TAG, "Tapped near geospatial anchor at index $index")
        // Handle interaction with the virtual object here
    }
}

YOUR_THRESHOLD is a tolerance value you define based on how precise you want the tap detection to be relative to the virtual object's screen position.
That works around the limitation you noted by converting world-space positions to screen space and then using basic distance calculations to infer interactions.
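
As a final note, one way to pick that threshold (an assumption on my side, not anything mandated by ARCore) is to treat it like a touch target and convert a dp value to pixels:

// Hypothetical threshold: roughly a 48 dp touch target converted to pixels.
// Assumes this runs where an Android `resources` object is available (Activity/Fragment).
val YOUR_THRESHOLD = 48f * resources.displayMetrics.density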