Inverted textures

Topics: Developer Forum
Feb 2, 2011 at 4:00 PM

I'm trying to draw textures aligned with a physics body. I got the textures positioned correctly, but I noticed they are vertically inverted: an arrow texture that points up is rendered pointing down. I'm not sure where I am going wrong with the math.

My approach is to convert everything to Farseer's meter units and draw accordingly:

            // Orthographic projection sized in meters (scale = visible height in meters)
            Matrix proj = Matrix.CreateOrthographic(scale * graphics.GraphicsDevice.Viewport.AspectRatio, scale, 0, 1);
            Matrix view = Matrix.Identity;

            effect.World = Matrix.Identity;
            effect.View = view;
            effect.Projection = proj;

            effect.TextureEnabled = true;
            effect.VertexColorEnabled = true;

            // Begin is an instance method, so it is called on the SpriteBatch instance
            spriteBatch.Begin(SpriteSortMode.BackToFront, BlendState.AlphaBlend, null, DepthStencilState.Default, RasterizerState.CullNone, effect);



where Paddle::Draw looks like:

                new Vector2(16f, 16f),        // origin of the texture
                0.1875f, SpriteEffects.None,  // the box is 3*2 = 6 meters wide; the texture is 32 pixels wide, so the scale is 6/32 = 0.1875f

The orthographic projection matrix seems fine to me, but I am obviously doing something wrong somewhere!

Feb 2, 2011 at 4:16 PM

You apply your matrix to SpriteBatch.Begin(...), so what happens internally is that you draw to the "standard" viewport with the y-axis pointing down (just a smaller one). SpriteBatch then takes the whole rendered scene and transforms it with your supplied matrix, flipping the y-axis in this case, like mirroring the complete screen. That's why your objects end up upside down. Try supplying a scale matrix that flips the y-axis (so a scale factor of -1 for y) as your effect.World matrix. Alternatively, just set SpriteEffects.FlipVertically in your Draw() call.
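Both options as a sketch (assuming the `effect` and `spriteBatch` instances from the first post; `texture`, `position`, `rotation`, `origin`, `scale`, and `layerDepth` are placeholders):

```csharp
// Option 1: flip the y-axis once for the whole batch. The rendered
// scene is mirrored vertically by the effect's world matrix.
effect.World = Matrix.CreateScale(1f, -1f, 1f);

// Option 2: leave the matrices alone and flip each sprite individually
// with the SpriteEffects.FlipVertically flag.
spriteBatch.Draw(texture, position, null, Color.White, rotation,
    origin, scale, SpriteEffects.FlipVertically, layerDepth);
```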

Feb 2, 2011 at 6:15 PM

Thanks for the explanation.

When I scale the effect's world matrix:

mEffect.World = Matrix.CreateScale(1f, -1f, 1f);

The movement of textures along Y is flipped. That is, when the physics object moves up, the texture moves down.

However, if I set SpriteEffects.FlipVertically in my SpriteBatch.Draw(...), the texture rotation breaks: the rotated sprite now points in the opposite direction :-'(.

I am beginning to think my approach to solving this is wrong. I try to put everything in Farseer's unit system and render textures using those positions. Do you recommend against this approach?


Feb 2, 2011 at 9:08 PM

One more thing: I was looking at the HelloWorldXNA sample. I see that +Y is assumed to point downwards (South), and everything is authored in that coordinate system. Until now, I was assuming +Y points in the Up (North) direction, since I am used to that convention from a physics background. I've thought about it a lot, but I'd like to know, from your experience using physics engines, which convention works out better?

The way I see it:

  • If we assume +Y as South: it aligns with the 2D graphics system, and all one needs to do is make sure the same convention is followed when using the physics system. E.g. setting the gravity to (0, +10) would then point South.
  • If we assume +Y as North: it aligns with the conventional/real-world physics coordinate system, and one then has to account for this change of coordinate system at render time. (My current approach.)
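In Farseer terms, the two conventions differ only in the sign of the gravity vector; a minimal sketch (the magnitude 10 is just an example):

```csharp
// +Y as South (matches SpriteBatch): gravity is positive along y.
World screenStyleWorld = new World(new Vector2(0f, 10f));

// +Y as North (textbook physics): gravity is negative along y, and the
// renderer has to flip y when drawing.
World physicsStyleWorld = new World(new Vector2(0f, -10f));
```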

I'd be interested to learn which system works out better in practice and why.


Feb 3, 2011 at 12:09 PM
Edited Feb 3, 2011 at 12:34 PM
brainydexter wrote:

I'd be interested to learn which system works out better in practice and why.

Short answer: Whatever suits you most works out best in practice.

Long answer: it really doesn't matter at all. Farseer does not care about gravity, scales, and so on. It just operates within a two-dimensional vector space and thus has no concept of up, down, left, or right. It is tuned for certain ranges of object sizes, but the rest is really up to you. If you wanted, you could make your gravity point along (20f, 0f), and objects would still fall "down" if you just rotate all positions and directions by 90° before drawing them.
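For illustration, a sketch of that last point (`body` is a hypothetical Farseer Body; Matrix.CreateRotationZ(MathHelper.PiOver2) maps +X to +Y):

```csharp
// Gravity points along +X in simulation space...
World world = new World(new Vector2(20f, 0f));

// ...but rotating every position by 90° before drawing maps +X to +Y,
// which SpriteBatch draws as "down", so objects still appear to fall.
Matrix simToScreen = Matrix.CreateRotationZ(MathHelper.PiOver2);
Vector2 drawPosition = Vector2.Transform(body.Position, simToScreen);
```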

With a custom projection matrix your axes can point in any direction you want and with a transformation matrix you map Farseer's vectors to these axes in any way you like.

In practice I'd therefore suggest keeping the transformations to a minimum for a hassle-free solution. SpriteBatch has the positive y-axis pointing down and its origin in the top-left corner of the screen, so just put Farseer's origin in the top left and set the gravity to something like (0f, 20f) if you want your objects to fall towards the bottom of the screen. The only thing you have to do then is some scaling between meters and pixels, but that's basic rule-of-three math.
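That meters↔pixels scaling is usually wrapped in a small helper, along the lines of the ConvertUnits class shipped with the Farseer samples (the ratio of 64 pixels per meter here is only an example):

```csharp
static class ConvertUnits
{
    // Example ratio: how many screen pixels one simulation meter spans.
    private const float PixelsPerMeter = 64f;

    public static Vector2 ToDisplay(Vector2 meters)
    {
        return meters * PixelsPerMeter;   // meters -> pixels
    }

    public static Vector2 ToSim(Vector2 pixels)
    {
        return pixels / PixelsPerMeter;   // pixels -> meters
    }
}
```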

It really depends on your game in the end, though. Think about something like Mario Galaxy in 2D, and you have a game where up and down rarely correlate with any specific edge of the screen. In the end, pick a coordinate system that feels right for you and stick to it.

In your case, however, keep in mind that SpriteBatch also treats rectangles as if they are defined by their top-left corner plus a width and a height. So if you flip any of your axes, the top-left corner of a rectangle eventually becomes the bottom-right one. If you mix that up, your sprites start doing headstands and all kinds of other crazy stuff ;)
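One way to picture that pitfall (a sketch; the numbers are arbitrary):

```csharp
// In normal SpriteBatch space this rectangle's anchor (10, 20) is its
// top-left corner, and it extends downward to y = 52.
Rectangle r = new Rectangle(10, 20, 32, 32);

// After scaling y by -1 the same vertices span y = -52 .. -20, so the
// anchor now sits at the rectangle's bottom-left corner. Any per-sprite
// origin or rotation has to account for that.
```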

Feb 4, 2011 at 5:12 AM

I appreciate your insight and the time you took to answer that.

Since I didn't have much to do, and out of curiosity, I went ahead and authored this 2D tower defense game assuming the SpriteBatch rendering system (+Y is South). Rendering was pretty straightforward, but I realized that even a simple action such as a tap on the screen had me going back and forth between physics and rendering coordinates so many times that I don't think it's a good idea (unless I am doing something wrong).

On the other hand, if everything is in physics space, which just requires converting the tap position to physics space, there aren't a whole lot of transformations I need to do in my update loop. It's all about transforming the textures by the projection matrix then, and yes, flipping the textures vertically :)
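That single conversion might look like this (a sketch; the pixelsPerMeter ratio and the viewport-height parameter are assumptions):

```csharp
// Convert a tap from screen space (pixels, +Y down, origin top-left)
// into physics space (meters, +Y up, origin bottom-left).
Vector2 ScreenToPhysics(Vector2 tap, float viewportHeightPixels, float pixelsPerMeter)
{
    float x = tap.X / pixelsPerMeter;
    float y = (viewportHeightPixels - tap.Y) / pixelsPerMeter;  // flip y
    return new Vector2(x, y);
}
```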

I personally feel that using physics space as the standard, and then manually tweaking the rendered sprites, is the better idea, since it cuts out a lot of conversion calls. I'm thinking about this from a phone-dev perspective.

Anyways, thanks for your reply.