[varLib] "Fix" cython iup issue?

In some cases we were seeing different output from iup depending on
whether or not we were running the cython-compiled code.

I've tracked this particular issue down to the line changed in this
diff, and the change does (locally, for me, on one machine with one
architecture and one compiler) seem to suppress the problem.

However... it feels pretty bad??

I'm not sure how motivated I am to try to generate a proper minimal
test case and get this fixed upstream. I guess I'm... medium
motivated? But at the very least it would be nice to figure out a more
robust way to prevent this optimization from happening, and at the very
_very_ least it would be nice to figure out a way to test this.
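
For what it's worth, the shape of test I have in mind is something like
the sketch below. It assumes iup_segment stays importable and callable
from Python even in a compiled build (if the compiled build hides it,
the same comparison could be driven through the public iup_delta
instead), and the coordinate/delta values are made up rather than
captured from a font that actually reproduced the mismatch:

# Rough sketch of a regression test: every point below falls strictly
# between the two reference coordinates on both axes, so iup_segment
# takes the "Interpolate" branch for each of them. We recompute the same
# arithmetic in plain Python and require exact (bit-for-bit) equality;
# a build that fuses the multiply-add can differ in the last bit and fail.
from fontTools.varLib.iup import iup_segment


def reference_interpolate(coords, rc1, rd1, rc2, rd2):
    # Plain-Python transcription of the interior-point case only: no
    # clamping, no handling of the degenerate x1 == x2 or x1 > x2 cases.
    deltas = []
    for point in coords:
        d_xy = []
        for j in (0, 1):
            x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j]
            scale = (d2 - d1) / (x2 - x1)
            nudge = (point[j] - x1) * scale  # same order of operations as
            d_xy.append(d1 + nudge)          # the patched line in the diff below
        deltas.append(tuple(d_xy))
    return deltas


def test_iup_segment_matches_pure_python():
    rc1, rd1 = (260.0, 100.0), (0.35, 1.2)
    rc2, rd2 = (265.0, 110.0), (0.875, 3.4)
    coords = [(261.0, 101.0), (262.5, 104.0), (264.0, 109.0)]

    got = [tuple(d) for d in iup_segment(coords, rc1, rd1, rc2, rd2)]
    expected = reference_interpolate(coords, rc1, rd1, rc2, rd2)

    assert got == expected  # exact equality, not pytest.approx

A stronger version would replay inputs recorded from the font where we
first saw the divergence, since whether the fused and unfused results
actually differ depends on the particular operand values.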

The solution I was hoping for was some way to write some actual
hand-written C so we could have finer-grained control over what's going
on, and use that just for this one little bit of arithmetic, but I
didn't see an easy way to do that.
Author: Colin Rofls, 2024-11-27 19:28:19 -05:00 (committed by Behdad Esfahbod)
Parent: e037cea726
Commit: d6f3c51895

@@ -74,7 +74,14 @@ def iup_segment(
                 d = d2
             else:
                 # Interpolate
-                d = d1 + (x - x1) * scale
+                #
+                # NOTE: we assign an explicit intermediate variable here in
+                # order to disable a fused mul-add optimization. See:
+                #
+                # - https://godbolt.org/z/YsP4T3TqK,
+                # - https://github.com/fonttools/fonttools/issues/3703
+                nudge = (x - x1) * scale
+                d = d1 + nudge
             out.append(d)
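
For anyone following along: the effect being suppressed is that a fused
multiply-add rounds only once. The product feeding the addition is never
rounded to a double on its own, so the final sum can land on a different
double than the separate multiply-then-add that pure Python performs. A
contrived illustration, using math.fma (Python 3.13+); the inputs are
chosen to make the difference obvious and have nothing to do with real
font data:

import math  # math.fma() is available from Python 3.13 onward

a = 1.0 + 2.0**-30

# Two roundings: a*a is rounded to the nearest double first (the 2**-60
# term is dropped), then the subtraction is performed.
separate = a * a - 1.0        # == 2**-29 exactly

# One rounding: the product and the addition are evaluated together, so
# the 2**-60 term survives. This is what a compiler-emitted FMA does to
# d1 + (x - x1) * scale.
fused = math.fma(a, a, -1.0)  # == 2**-29 + 2**-60 exactly

assert separate != fused

The nudge temporary is meant to force the product to be rounded and
stored before the add, i.e. the same two-rounding behaviour the
pure-Python code has; as noted above, it only demonstrably works for the
one compiler/architecture combination I tried.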