I'm trying to offset a cubic Bézier curve by doing the following:
- find the x and y derivatives to get the tangent vector (dx, dy)
- rotate the vector 90° via (-dy, dx) to get the normal
- make it a unit vector by dividing by its magnitude
- multiply by the desired offset and add it to the point on the curve
    from math import sqrt

    def cubic_offset(P0, P1, P2, P3, t, dist):
        initx, inity = cubic(P0, P1, P2, P3, t)    # point on the curve
        dx, dy = cubic_dt(P0, P1, P2, P3, t)       # tangent vector
        normx, normy = -dy, dx                     # rotate 90° to get the normal
        mag = sqrt(normx**2 + normy**2)
        normx, normy = normx / mag, normy / mag    # make it a unit vector
        return initx + dist * normx, inity + dist * normy
However, this isn't working, especially near the extrema.
I also found this blog post https://observablehq.com/@s-silva/bezier-curve-offsets which confirms that this process works, and it provides a preview that doesn't produce errors.
Our code matches almost perfectly, yet his works and mine doesn't.
My derivative is also correct: I'm using it to find the extrema and the bounding box, and that works perfectly fine.
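For context, here is roughly what my `cubic` and `cubic_dt` compute (a minimal sketch of the standard Bernstein forms, assuming control points are `(x, y)` tuples; my actual code is equivalent up to algebraic rearrangement):

```python
def cubic(P0, P1, P2, P3, t):
    # Point on the curve: B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3
    u = 1 - t
    x = u**3*P0[0] + 3*u**2*t*P1[0] + 3*u*t**2*P2[0] + t**3*P3[0]
    y = u**3*P0[1] + 3*u**2*t*P1[1] + 3*u*t**2*P2[1] + t**3*P3[1]
    return x, y

def cubic_dt(P0, P1, P2, P3, t):
    # Derivative: B'(t) = 3(1-t)^2 (P1-P0) + 6(1-t)t (P2-P1) + 3t^2 (P3-P2)
    u = 1 - t
    dx = 3*u**2*(P1[0]-P0[0]) + 6*u*t*(P2[0]-P1[0]) + 3*t**2*(P3[0]-P2[0])
    dy = 3*u**2*(P1[1]-P0[1]) + 6*u*t*(P2[1]-P1[1]) + 3*t**2*(P3[1]-P2[1])
    return dx, dy
```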

Solved:
Because the offset was only wrong around the extrema, where the derivative is close to 0, I figured it could have something to do with floating-point arithmetic.
I copied the derivative from Wikipedia and it worked perfectly, even though it is mathematically equivalent to my derivative.
Apparently, when calculated in Python, the average difference between them for t between 0 and 1 and control points between 0 and 700 is 0.049 (averaged over 2,000,000 samples), and for t between 0 and 100 the average difference is a staggering 43.2.
Any explanation of how this error is produced would be much appreciated, as I'm not that knowledgeable about such things.
EDIT:
Turns out it's not floating point to blame, but tuples! Or maybe something else, I don't know; any help would be appreciated.
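In case it helps anyone debugging something similar, here is a classic tuple gotcha (this is a general Python fact, not a diagnosis of my exact bug): `+` and `*` on tuples do concatenation and repetition, not elementwise arithmetic.

```python
# Control points stored as (x, y) tuples
P0, P1 = (1, 2), (3, 4)

print(P1 + P0)   # concatenation: (3, 4, 1, 2), not (4, 6)
print(P0 * 3)    # repetition: (1, 2, 1, 2, 1, 2), not (3, 6)

# P1 - P0 raises a TypeError; elementwise math needs explicit unpacking:
dx, dy = P1[0] - P0[0], P1[1] - P0[1]
print((dx, dy))  # (2, 2)
```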
Results: