A complete Texture to Vertices example

Topics: Developer Forum
Feb 13, 2011 at 10:42 AM

I am trying to use the Texture to Vertices technique, and I found two different ways of doing it, yet I haven't been able to get it working myself.

The first example I saw is in the documentation, and the second is the Demo1Screen class from the AdvancedSamples XNA WP7 project.

The methods are quite similar, but they vary slightly.  For example, the Demo1Screen class iterates over the Vertices collection to scale them and then calls ForceCounterClockWise, while the example from the documentation doesn't.


Is there an example I can follow which doesn't use Camera2D, DebugView or BasicEffect and includes drawing with SpriteBatch?

I am now trying to implement the example from the documentation, but I'm still having trouble making things work.  For example, when drawing with the sprite batch, do you need to scale the position and angle?  And why is the scale used when drawing on screen?

Thank you for your time,
//Dreas. 

Coordinator
Feb 13, 2011 at 12:23 PM

I'm sorry that the documentation is not up to date. It takes a lot of time to maintain, and at the moment I've decided to spend my time on the engine instead. I will write new documentation when FPE 3.3 comes out, as the current code samples will no longer apply.

The reason the scale is in there is to flip the polygon on the Y-axis and make it smaller. ForceCounterClockWise is needed because the engine expects its input to be counter-clockwise, and since we just flipped the Y-axis, the winding has changed.

It is not easy to understand at first. I'll try to describe it in steps:

1. An algorithm traces the outline of the texture (in pixels).
2. The output (in pixels) matches the texture, so the upper left corner of the texture is (0,0). The engine needs (0,0) to be the center of the polygon, so we translate (move) the vertices by the centroid (center).
3. The texture's origin is still its upper left corner, but the polygon is now centered on the centroid, so we save the centroid as the origin to use when we draw.
4. We have to decompose the polygon (if it is concave) into several smaller convex polygons, because the engine only works with convex polygons.
5. We scale the polygon by a factor of our choice to convert it from pixels to meters. The Y-axis is negated to flip all polygons on the Y-axis.
6. We force all the polygons to be counter-clockwise, because we just flipped the Y-axis.
7. We draw the texture on the screen, but we have to convert the position from meters back to pixels. We also have to give SpriteBatch the scale and origin of the polygon.
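The steps above can be sketched in code roughly like this (a sketch only; the API names are the ones used elsewhere in this thread, `texture` and `body` are placeholders, and ConvertUnits is the helper class from the samples framework, assuming the default 100 pixels-per-meter ratio):

    // 1. Trace the outline of the texture (in pixels).
    uint[] data = new uint[texture.Width * texture.Height];
    texture.GetData(data);
    Vertices outline = PolygonTools.CreatePolygon(data, texture.Width, false);

    // 2 + 3. Center the polygon on its centroid, and remember the centroid
    //        as the origin to pass to SpriteBatch.Draw later.
    Vector2 centroid = -outline.GetCentroid();
    outline.Translate(ref centroid);
    Vector2 origin = -centroid;

    // 4. Decompose the (possibly concave) outline into convex pieces.
    List<Vertices> convexPieces = BayazitDecomposer.ConvexPartition(outline);

    // 5 + 6. Convert pixels to meters, flip the Y-axis, and restore the
    //        counter-clockwise winding the engine expects.
    Vector2 scale = new Vector2(0.01f, -0.01f); // assumes 1 meter = 100 pixels
    foreach (Vertices piece in convexPieces)
    {
        piece.Scale(ref scale);
        piece.ForceCounterClockWise();
    }

    // 7. When drawing, convert the body position back to pixels.
    Vector2 screenPosition = ConvertUnits.ToDisplayUnits(body.Position);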


The new HelloWorld project in source control contains a sample that uses SpriteBatch. Demo1 from AdvancedSamples also uses SpriteBatch. What might confuse you is that a BasicEffect is used in conjunction with SpriteBatch; it is there to automatically convert positions from meters to pixels.

Feb 13, 2011 at 2:27 PM
Edited Feb 13, 2011 at 2:27 PM

Genbox,
Thanks for your reply and I completely respect and understand your decision to work on the engine rather than the documentation at this point.


So, this means that Demo1 from AdvancedSamples is the way it should be done?  If so, I will now follow that sample.

My question now is: since I won't use a BasicEffect and Camera2D, what do I need to change in Demo1 to make it work without them?

For unit conversion, I will be using the ConvertUnits class.

Coordinator
Feb 13, 2011 at 4:14 PM

You would need to do it just like the AdvancedSamples, but you will need to convert the body.Position value to pixels when drawing it.
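For example (a sketch; ConvertUnits is the helper from the samples framework, and `texture`, `body`, and `origin` are placeholders):

    // body.Position is in meters; SpriteBatch works in pixels.
    Vector2 screenPosition = ConvertUnits.ToDisplayUnits(body.Position);
    spriteBatch.Draw(texture, screenPosition, null, Color.White,
                     body.Rotation, origin, 1f, SpriteEffects.None, 0f);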

Feb 13, 2011 at 8:20 PM
Edited Feb 13, 2011 at 8:31 PM

I am now trying to follow Demo1 from AdvancedSamples, but I still can't get it to fully work.  This is an example of a class I'm playing around with (mostly taken from Demo1):

    using System.Collections.Generic;
    using FarseerPhysics.Common;
    using FarseerPhysics.Common.Decomposition;
    using FarseerPhysics.Common.PolygonManipulation;
    using FarseerPhysics.Dynamics;
    using FarseerPhysics.Factories;
    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;
    //ConvertUnits comes from the samples framework.

    class TextureVerticesEntity
    {
        private SpriteBatch _spriteBatch;

        private Body _compound;
        private Vector2 _origin;
        private Texture2D _polygonTexture;
        private Vector2 _scale;
        private World _world;

        public TextureVerticesEntity(Game game, SpriteBatch sb, World world, Vector2 pos, string image)
        {
            _world = world;
            _spriteBatch = sb;

            //Load the texture that will represent the physics body
            _polygonTexture = game.Content.Load<Texture2D>(image);

            //Create an array to hold the data from the texture
            uint[] data = new uint[_polygonTexture.Width * _polygonTexture.Height];

            //Transfer the texture data to the array
            _polygonTexture.GetData(data);

            //Find the vertices that make up the outline of the shape in the texture
            Vertices textureVertices = PolygonTools.CreatePolygon(data, _polygonTexture.Width, false);

            //The tool returns vertices as they were found in the texture.
            //We need to find the real center (centroid) of the vertices for 2 reasons:

            //1. To translate the vertices so the polygon is centered around the centroid.
            Vector2 centroid = -textureVertices.GetCentroid();
            textureVertices.Translate(ref centroid);

            //2. To draw the texture in the correct place.
            _origin = -centroid;

            //We simplify the vertices found in the texture.
            textureVertices = SimplifyTools.ReduceByDistance(textureVertices, 4f);

            //Since it is a concave polygon, we need to partition it into several smaller convex polygons
            List<Vertices> list = BayazitDecomposer.ConvexPartition(textureVertices);

            //Now we need to scale the vertices (result is in pixels, we use meters)
            //At the same time we flip the y-axis.
            _scale = new Vector2(0.05f, -0.05f);

            foreach (Vertices vertices in list)
            {
                vertices.Scale(ref _scale);

                //When we flip the y-axis, the orientation can change.
                //We need to remember that FPE works with CCW polygons only.
                vertices.ForceCounterClockWise();
            }

            //Create a single body with multiple fixtures
            _compound = BodyFactory.CreateCompoundPolygon(world, list, 1f, ConvertUnits.ToSimUnits(pos));
            _compound.BodyType = BodyType.Dynamic;
        }

        public void Draw(GameTime gameTime)
        {
            _spriteBatch.Draw(_polygonTexture, ConvertUnits.ToDisplayUnits(_compound.Position), null, Color.White, _compound.Rotation, _origin, 1f, SpriteEffects.None, 1f);
        }
    }

Can you spot anything that looks wrong in that code?  Is there some sort of discrepancy between _scale and ConvertUnits' conversions?

A minor difference in my code is that I'm drawing with 1f as the scale, because if I use _scale, nothing appears.  Why is that?

Feb 14, 2011 at 8:13 PM

Finally I got it to work.

The display-to-sim unit ratio is 100 by default in the ConvertUnits class, so I changed this line:

_scale = new Vector2(0.05f, -0.05f);

to:

_scale = new Vector2(0.01f, -0.01f);

Now everything's working like a charm.

Thanks again.

Feb 26, 2011 at 12:39 PM
Edited Feb 26, 2011 at 12:40 PM

There's still one part of the algorithm which I don't understand.

In step 5, why do you flip the y-axis?  

If I leave the code as _scale = new Vector2(0.05f, -0.05f);, all the fixtures are aligned upside down, but if I set the y-scale to 0.05f, they are aligned correctly.

Mar 4, 2011 at 6:01 PM

I think:

_scale = Vector2.One * (1f / Meter_in_Pixel);
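That suggestion can be taken a step further by deriving the scale directly from ConvertUnits, so the two can never disagree (a sketch; it assumes the ConvertUnits helper from the samples framework, and note that the Y component still needs to be negated if you want the flip described in step 5 above):

    // ToSimUnits(1f) returns how many meters one pixel represents
    // (0.01f with the default ratio of 100 pixels per meter).
    float metersPerPixel = ConvertUnits.ToSimUnits(1f);

    // Negate Y to flip the polygon on the Y-axis.
    _scale = new Vector2(metersPerPixel, -metersPerPixel);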