Is there an algorithm that finds the average of the outliers? (Converting LiDAR data to touch points)


I'm relatively new to programming, though I understand how it works at an intermediate level.

I have a LiDAR sensor (YDLiDAR-G4) mounted at the top of a wall, scanning across the wall's surface. The goal is that when someone touches the wall, the LiDAR detects the touch and converts the polar data into x and y coordinates. The sensor has to be able to detect multiple touch points at once, for example one on the right side of the sensor and one on the left. The number of touch points could theoretically be unlimited with a good enough sensor, but of course that's not possible, so let's say there is a maximum of 10 touch points.
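For context, the polar-to-cartesian step I mean is just the standard trig conversion. A minimal sketch (the `Point2D` struct and `polarToCartesian` name are my own placeholders, not from any SDK):

```cpp
#include <cmath>

struct Point2D { float x, y; };

// Convert one LiDAR sample (angle in radians, range in meters)
// to cartesian coordinates in the sensor's frame.
static Point2D polarToCartesian(float angleRad, float range) {
    return { range * std::cos(angleRad), range * std::sin(angleRad) };
}
```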

I have figured out pretty much all of the above. However, because the sensor scans many different points, a single "touch point" appears as multiple detected points that are very close together.

Is there a way to turn all of these points into one single "touch point"?

At first I tried a nested for loop that checks whether a point is within a certain distance of another point and, if so, filters it out, but this is inconsistent and doesn't work well.

In this code I have two vectors that hold a `Point2D` struct (cartesian coordinates). It checks whether any point is within a certain distance of another and pushes only one of them to the next list. `list1` has all of the x and y coordinates that the LiDAR picks up; `list2` starts empty.

static void filterPoints(const vector<Point2D>& list1, vector<Point2D>& list2, float distance) { //Modifying the two lists

    for (auto iter = list1.begin(); iter != list1.end(); iter++) {
        auto& inputPoint = *iter;
        bool merged = false;

        for (auto comparePointIter = list1.begin(); comparePointIter != list1.end(); comparePointIter++) {
            auto& comparePoint = *comparePointIter;

            // Note: comparePoint eventually aliases inputPoint itself, so
            // pointDistance() returns 0 and merged always ends up true.
            if (pointDistance(inputPoint, comparePoint) < distance) {
                merged = true;
            }
            // This check runs inside the inner loop, so inputPoint gets
            // pushed once for every far-away comparePoint seen before
            // the first nearby one.
            if (!merged) {
                list2.push_back(inputPoint);
            }
        }
    }
}

I already know this doesn't work; I'm pretty sure it only filters out one point of many, but I have no idea how else to do this with this approach.

So that brings me to my question: is there an algorithm that takes the average of each group of outliers only? For example, if I have a thick finger and the sensor detects points at (.7, .7), (.72, .71) and (.75, .73), and also at (.3, .3), (.29, .29) and (.28, .28), can I average these and output them as two individual points, (.723, .713) and (.29, .29)?
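One way to get that behavior is a greedy clustering pass: each point joins the first cluster whose running centroid is within the distance threshold, otherwise it starts a new cluster, and at the end each cluster is averaged into one point. This is a sketch of that idea, not the questioner's code; `clusterPoints` is a made-up name, and `Point2D`/`pointDistance` are redefined here so the example is self-contained:

```cpp
#include <cmath>
#include <vector>
using std::vector;

struct Point2D { float x, y; };

static float pointDistance(const Point2D& a, const Point2D& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Greedy single-pass clustering: each point joins the first cluster
// whose running centroid is within `distance`, otherwise it starts a
// new cluster. Returns one averaged point per cluster.
static vector<Point2D> clusterPoints(const vector<Point2D>& input, float distance) {
    vector<Point2D> sums;  // per-cluster coordinate sums
    vector<int> counts;    // per-cluster point counts

    for (const Point2D& p : input) {
        bool merged = false;
        for (size_t i = 0; i < sums.size(); ++i) {
            Point2D centroid = { sums[i].x / counts[i], sums[i].y / counts[i] };
            if (pointDistance(p, centroid) < distance) {
                sums[i].x += p.x;
                sums[i].y += p.y;
                ++counts[i];
                merged = true;
                break;
            }
        }
        if (!merged) {          // far from every cluster: start a new one
            sums.push_back(p);
            counts.push_back(1);
        }
    }

    vector<Point2D> centroids;  // average each cluster into one touch point
    for (size_t i = 0; i < sums.size(); ++i)
        centroids.push_back({ sums[i].x / counts[i], sums[i].y / counts[i] });
    return centroids;
}
```

With the six example points above and a threshold of 0.1, this collapses them into two clusters whose averages are approximately (.723, .713) and (.29, .29). The threshold must be tuned: it should be larger than the gap between LiDAR samples on one finger, but smaller than the gap between two separate fingers.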
