Scaling of polygons (textureToPolygon)

Topics: Developer Forum, User Forum
Apr 22, 2011 at 10:21 PM

Hi

I'm working on porting my game Asteroid Lander to WP7 with Farseer 3.2 and I've run into something a bit odd.

I want to be able to scale my polygons both in the way they are drawn on screen and in the physics engine, so in Farseer I do the following:

public static List<Fixture> ImageToPolygonBody(Texture2D texture, World world, float density, float scale_, ref Vector2 polygonOrigin)
{
    List<Fixture> compoundFixture;

    //Create an array to hold the data from the texture
    uint[] data = new uint[texture.Width * texture.Height];

    //Collect data from the bitmap
    texture.GetData(data);

    //Create a polygon from the bitmap
    Vertices verts = PolygonTools.CreatePolygon(data, texture.Width, texture.Height, false);

    //Scale the vertices; note that the scale factor is converted to sim units here
    Vector2 scale = new Vector2(ConvertUnits.ToSimUnits(scale_), ConvertUnits.ToSimUnits(scale_));
    verts.Scale(ref scale);

    //Make sure that the origin of the texture is the centroid (real center of geometry)
    polygonOrigin = verts.GetCentroid();

    //Translate the polygon so that it aligns properly with the centroid
    Vector2 vertsTranslate = -polygonOrigin;
    verts.Translate(ref vertsTranslate);

    //We could simplify the vertices found in the texture:
    //verts = SimplifyTools.ReduceByDistance(verts, 4f);

    //Decompose the polygon into smaller chunks that Farseer can process better
    List<Vertices> list = BayazitDecomposer.ConvexPartition(verts);

    //Create a single body with multiple fixtures
    compoundFixture = FixtureFactory.CreateCompoundPolygon(world, list, density);

    return compoundFixture;
}

This works great if the scale is 1.0. With a scale of 2.0, however, the drawing is wrong: the texture is positioned to the left of where it should actually be.

This is my draw code:

spriteBatch.Draw(texture, ConvertUnits.ToDisplayUnits(compoundFixture[0].Body.Position), null, Color.White, compoundFixture[0].Body.Rotation, ConvertUnits.ToDisplayUnits(polygonOrigin), scale, SpriteEffects.None, 0f);

However, if I divide polygonOrigin by the scale, then it works:

spriteBatch.Draw(texture, ConvertUnits.ToDisplayUnits(compoundFixture[0].Body.Position), null, Color.White, compoundFixture[0].Body.Rotation, ConvertUnits.ToDisplayUnits(polygonOrigin) / scale, scale, SpriteEffects.None, 0f);

Can anybody tell me why I need to divide it, or let me know if something else is wrong with the code?

Since it works, it's not a major problem; I just wonder why my ugly hack is needed.

Apr 24, 2011 at 9:52 AM

The origin of the sprite must be in pixels because it's in texture space, while the polygon origin you're receiving is in sim units. Because you scale the vertices by ToSimUnits(scale) before computing the centroid, ConvertUnits.ToDisplayUnits(polygonOrigin) gives you scale times the pixel-space centroid, so you have to divide by the scale to get back to pixels. Alternatively, compute the centroid and translate the vertices first, and then perform the scaling. That way your polygonOrigin will remain in pixels.
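
A sketch of that reordering, using the same Farseer 3.2 / XNA helpers as the method above (this is just the middle of the poster's ImageToPolygonBody with the steps swapped, not library code):

```csharp
//Create a polygon from the bitmap, as before
Vertices verts = PolygonTools.CreatePolygon(data, texture.Width, texture.Height, false);

//Compute the centroid while the vertices are still in pixel space,
//so polygonOrigin stays in pixels
polygonOrigin = verts.GetCentroid();

//Translate the polygon so that it aligns properly with the centroid
Vector2 vertsTranslate = -polygonOrigin;
verts.Translate(ref vertsTranslate);

//Only now convert to sim units and apply the game scale
Vector2 scale = new Vector2(ConvertUnits.ToSimUnits(scale_), ConvertUnits.ToSimUnits(scale_));
verts.Scale(ref scale);
```

With this ordering the draw call no longer needs the division: since polygonOrigin is already in pixels, you pass it straight to spriteBatch.Draw as the origin argument instead of ConvertUnits.ToDisplayUnits(polygonOrigin) / scale.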