
Bug in PolygonizeTriangles function.

Aug 23, 2009 at 12:36 PM

I'm not sure, but I think there is a bug in the PolygonizeTriangles function. Sometimes when I make a large object and tell it to auto-divide, it crashes during this function by overrunning the array. I've tried to figure it out, but it's a little over my head. I was wondering if genbox or Matt may be able to take a quick look at it.

See code snippet below.

    newP = null;
    if (polyIndex < polysLength)
        //If identical points are present, a triangle gets
        //borked by the MergeParallelEdges function, hence
        //the vertex number check
        if (poly.nVertices >= 3) polys[polyIndex] = new Polygon(poly); // <-- THIS IS THE LINE IT CRASHES ON
        //else printf("Skipping corrupt poly\n");
    if (poly.nVertices >= 3) polyIndex++; //Must be outside (polyIndex < polysLength) test
    return polyIndex;
I have also had arrays overrun in the TriangulatePolygon function.

Is there a maximum size for these functions?

Aug 23, 2009 at 4:18 PM

There are a lot of limitations to those functions. They will crash on some shapes: complex polygons, self-intersecting polygons and polygon soups. It is very hard to create a generic algorithm that can handle all cases. The triangulator is a modified ear-clipping algorithm, and it is only usable for simple polygons. Right now the current code will have to do until I get to look a little more at some alternatives. It is hard enough to create a fast and stable physics engine, but I also have to support all the user-contributed tools.
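For context, the core of an ear-clipping triangulator is the ear test: a vertex can be clipped when the triangle it forms with its two neighbours is convex and contains no other vertex of the polygon. That test is exactly what breaks down on self-intersecting input, which is why earclip only handles simple polygons. Here is a minimal sketch in C++ (the language of the original earclip library; `Vec2`, `IsEar` and the CCW-winding assumption are mine, not FPE code):

```cpp
#include <cstddef>
#include <vector>

struct Vec2 { float x, y; };

// Twice the signed area of triangle (a, b, c); positive when CCW.
static float Cross(const Vec2& a, const Vec2& b, const Vec2& c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

// True if p lies strictly inside the CCW triangle (a, b, c).
static bool PointInTriangle(const Vec2& p, const Vec2& a, const Vec2& b, const Vec2& c) {
    return Cross(a, b, p) > 0 && Cross(b, c, p) > 0 && Cross(c, a, p) > 0;
}

// Vertex i of a CCW simple polygon is an "ear" if the corner is convex
// and no other polygon vertex lies inside the candidate triangle.
bool IsEar(const std::vector<Vec2>& poly, std::size_t i) {
    std::size_t n = poly.size();
    std::size_t iPrev = (i + n - 1) % n;
    std::size_t iNext = (i + 1) % n;
    const Vec2& prev = poly[iPrev];
    const Vec2& cur  = poly[i];
    const Vec2& next = poly[iNext];
    if (Cross(prev, cur, next) <= 0)
        return false; // reflex corner, cannot be clipped
    for (std::size_t j = 0; j < n; ++j) {
        if (j == i || j == iPrev || j == iNext) continue;
        if (PointInTriangle(poly[j], prev, cur, next))
            return false; // another vertex blocks this ear
    }
    return true;
}
```

A simple polygon with at least four vertices always has at least two ears, so repeatedly clipping ears terminates; on degenerate or self-intersecting input the test can reject every vertex, which is one way these decomposers end up failing.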

3.0 will hopefully change this as we are rebuilding the whole thing. I will use algorithms found in mature libraries only.

Aug 24, 2009 at 1:18 AM

Thanks genbox. That's fine, I was just wondering if it was a known issue. I think it works pretty well for almost all polygons; it only fails in a few cases.
I found that it returns -1 on failure, so I made it check whether it is about to overrun an array and, if so, return -1.
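That guard could look something like this (a hedged C++ sketch of the idea, not the actual patch; `StorePolygon` is a hypothetical helper, and the parameter names just mirror the snippet above):

```cpp
#include <cstddef>

// Sketch of the overrun guard described above: instead of writing past
// the end of the polys array and crashing, bail out with -1, the value
// the decomposition functions already use to signal failure.
// On success, returns the next free index.
int StorePolygon(int* polys, std::size_t polysLength, std::size_t polyIndex, int poly) {
    if (polyIndex >= polysLength)
        return -1;              // would overrun: report failure instead of crashing
    polys[polyIndex] = poly;
    return static_cast<int>(polyIndex) + 1;
}
```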

Aug 24, 2009 at 6:48 AM

I think I had a similar issue decomposing geoms for SAT. I don't have the code on hand, but I think I tracked it down to a fixed-size temporary array. I changed it from 50 to 200 and my problem went away, so I forgot about it. I'm not sure if it was in "PolygonizeTriangles", but I would check the size of the "polys" array... although based on the snippet you posted, it shouldn't be able to overrun the array.

Jan 13, 2010 at 7:05 PM

Yeah, that is correct. The maximum number of polygons in the original version was 50. I replaced the arrays with lists (they auto-resize) to get around that issue. I'm overhauling the convex decomposition code for FPE 3.0; hopefully I can increase reliability and performance and make it easier to use.

Jan 13, 2010 at 11:09 PM

Hey now! That code was pretty good, just a little finicky lol. I guess some of my prototyping mess made it through.

Jan 14, 2010 at 12:58 AM

It has nothing to do with your port Matt :) You did a great job of porting over the whole earclip library. The library itself has some problems that we can fix by small adjustments. The API is nicer now and it is consistent across multiple implementations (Bayazit and earclip).

There was also a limitation of 175 triangles in the earclip algorithm; that has also been removed. At some point I might make the implementation of the earclip more generic and experiment a little by swapping one implementation out for another. It is a 3-part system: preparation, triangulation and triangle merging. Each part can be swapped out with another implementation to make it perform faster or support new features.
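That three-stage design could be sketched like this in C++ (hypothetical names and signatures for illustration only, not the actual FPE 3.0 API):

```cpp
#include <functional>
#include <vector>

struct Vec2 { float x, y; };
using Polygon   = std::vector<Vec2>;
using Triangles = std::vector<Polygon>; // each entry holds 3 vertices

// Hypothetical pluggable pipeline: each stage is a function object,
// so one implementation can be swapped for another without touching
// the rest of the decomposer.
struct Decomposer {
    std::function<Polygon(const Polygon&)>   prepare;     // e.g. merging parallel edges
    std::function<Triangles(const Polygon&)> triangulate; // e.g. an earclip pass
    std::function<std::vector<Polygon>(const Triangles&)> merge; // e.g. triangle merging

    std::vector<Polygon> Decompose(const Polygon& input) const {
        return merge(triangulate(prepare(input)));
    }
};
```

Swapping one triangulator for another is then just an assignment to `triangulate`, while the preparation and merging stages stay as they are.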