
solving x*V1 + y*V2 = V3

Started by
14 comments, last by Max Power 5 years, 8 months ago

I know this is basically a 5th grade math problem, but I'm very rusty and I was hoping there is a way to solve this thing directly.

Background: I am working on a system that morphs clothing meshes based on the (dynamic) morphing/deformation of their underlying reference body mesh. The best approach I could come up with is to map every clothing vertex to a body triangle in advance and use this data later to shift the vertices along with their triangles. My guess is that results will be smoothest if I keep the relative position of the vertex on the triangle plane (which can be outside the triangle area if no better match was found). For this, I want to determine x and y, so I can later "restore" them. With the triangle positions being t1, t2 and t3: V1 = t2 - t1, V2 = t3 - t1 and V3 = ClothingVertPos - t1 (projected onto the triangle plane).

Speed is not an issue, because this gets prepared ahead of time.
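For reference: since V3 lies in the plane spanned by V1 and V2, x and y can be solved for directly. Dotting both sides of x*V1 + y*V2 = V3 with V1 and with V2 gives a 2x2 linear system, which Cramer's rule solves in closed form. A minimal sketch (the Vec3 type and all names here are stand-ins for illustration, not engine types):

```cpp
#include <cassert>
#include <cmath>

// Hypothetical minimal stand-in for an engine vector type.
struct Vec3 {
    float x, y, z;
};

float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Solve x*V1 + y*V2 = V3 for x and y: dot both sides with V1 and V2
// to get a 2x2 system, then apply Cramer's rule.
// Assumes V1 and V2 are not parallel (denom != 0) and V3 is in their plane.
void SolveXY(const Vec3& V1, const Vec3& V2, const Vec3& V3, float& x, float& y) {
    float d11 = Dot(V1, V1);
    float d12 = Dot(V1, V2);
    float d22 = Dot(V2, V2);
    float d31 = Dot(V3, V1);
    float d32 = Dot(V3, V2);
    float denom = d11 * d22 - d12 * d12;
    x = (d22 * d31 - d12 * d32) / denom;
    y = (d11 * d32 - d12 * d31) / denom;
}
```

This works whether or not the point is inside the triangle, since x and y are free to take any value.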


Just project V3 onto normalized V1 and do the same with V3 onto normalized V2 to get the scalars of the normalized vectors. It should be simple since you are able to project V3 onto the V1/V2 plane.

That said, I think your solver is going to be seriously wonky.

🙂🙂🙂🙂🙂 ← The tone posse, ready for action.

Sounds like you want barycentric coordinates? https://www.scratchapixel.com/lessons/3d-basic-rendering/ray-tracing-rendering-a-triangle/barycentric-coordinates

Thanks for the replies.

5 hours ago, fleabay said:

Just project V3 onto normalized V1 and do the same with V3 onto normalized V2 to get the scalars of the normalized vectors. It should be simple since you are able to project V3 onto the V1/V2 plane.

That said, I think your solver is going to be seriously wonky.

I did it like that. Should I project V3 onto the triangle plane first, though? Also, when I use Vector1.ProjectOnTo(Vector2) (it's Unreal Engine), I get another vector, so I suppose I should take its size/length then?

And most importantly: what about the scalars? Since the vectors are not orthogonal, I think the resulting point will be off if I just add them back together with their vectors.
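That suspicion can be checked numerically: projecting onto each normalized direction independently only recovers the coefficients when the two directions are orthogonal. A quick sketch (with a hypothetical minimal vector type, not the UE one) that reconstructs a point from the two independent projection scalars and shows the mismatch:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical minimal stand-in for an engine vector type.
struct Vec3 { float x, y, z; };

float Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
float Length(const Vec3& v) { return std::sqrt(Dot(v, v)); }
Vec3 Scale(const Vec3& v, float s) { return {v.x*s, v.y*s, v.z*s}; }
Vec3 Add(const Vec3& a, const Vec3& b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }

// Reconstruct a point from per-axis projection scalars (the naive approach):
// project V3 onto normalized V1 and normalized V2 separately, then sum.
// For non-orthogonal V1/V2 this does NOT reproduce V3.
Vec3 NaiveReconstruct(const Vec3& V1, const Vec3& V2, const Vec3& V3) {
    float s1 = Dot(V3, V1) / Length(V1);   // scalar along normalized V1
    float s2 = Dot(V3, V2) / Length(V2);   // scalar along normalized V2
    Vec3 n1 = Scale(V1, 1.0f / Length(V1));
    Vec3 n2 = Scale(V2, 1.0f / Length(V2));
    return Add(Scale(n1, s1), Scale(n2, s2));
}
```

For example, with V1 = (1,0,0), V2 = (1,1,0) and V3 = 2*V1 + 3*V2 = (5,3,0), the naive reconstruction lands at (9,4,0), not back at V3. The coupled 2x2 solve (or barycentric coordinates) handles the non-orthogonality correctly.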

1 hour ago, JoeJ said:

I think that's overkill for what I'm trying to do. Besides, I need to be able to get coordinates outside the triangle as well.

So I was thinking: in order to keep the relative location of a single vertex to a given "reference" triangle that can dynamically have all its vertices moved, it should be easiest to just use one triangle vertex as the origin and describe the relative location with two scalars. Each scalar is multiplied with one of the vectors connecting the other two vertices to the origin vertex, and their sum should then define the correct position for the "single vertex" at any time. The scalars can be greater than 1 or smaller than 0 without problems if the vertex is not on the triangle. So I just need to compute the scalars from the original/initial position of the vertex relative to the triangle. That's what I'm trying to do.

10 hours ago, Max Power said:

So I was thinking: in order to keep the relative location of a single vertex to a given "reference" triangle that can dynamically have all its vertices moved, it should be easiest to just use one triangle vertex as the origin and describe the relative location with two scalars. Each scalar is multiplied with one of the vectors connecting the other two vertices to the origin vertex, and their sum should then define the correct position for the "single vertex" at any time. The scalars can be greater than 1 or smaller than 0 without problems if the vertex is not on the triangle. So I just need to compute the scalars from the original/initial position of the vertex relative to the triangle. That's what I'm trying to do.

That's exactly what barycentric coords do, give it a try. It's the simplest way. (Only orthogonal frames can be done more simply, but triangle edges are not orthogonal.)

The point can be outside the triangle, and you only need two barycentric coords (the third is just 1 - (u + v)).

... i see my link is more about ray triangle intersection. So here is some code just about barycentric coords:


inline vec ToBarycentricCoords3D (vec a, vec b, vec c, vec p)
{
    vec v0 = b - a,
        v1 = c - a,
        v2 = p - a;
    float d00 = v0.Dot(v0);
    float d01 = v0.Dot(v1);
    float d11 = v1.Dot(v1);
    float d20 = v2.Dot(v0);
    float d21 = v2.Dot(v1);
    float denom = d00 * d11 - d01 * d01;
    vec bc;
    bc[1] = (d11 * d20 - d01 * d21) / denom;
    bc[2] = (d00 * d21 - d01 * d20) / denom;
    bc[0] = 1.0f - bc[1] - bc[2];
    return bc;
}

inline vec FromBarycentricCoords (vec a, vec b, vec c, vec bc)
{
    return bc[0] * (a - c) + bc[1] * (b - c) + c;
}

Edit: Fixed a bug that turned up in the following posts here as well.
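The snippet above assumes a vec type with a Dot() method and index access. A self-contained round-trip check with a minimal stand-in type (everything below is a hypothetical sketch, not the original vec) confirms that the conversion also survives points outside the triangle:

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Minimal stand-in for the 'vec' type the snippet assumes:
// three floats with a Dot() method and index access.
struct vec {
    std::array<float, 3> e;
    float& operator[](int i) { return e[i]; }
    float operator[](int i) const { return e[i]; }
    float Dot(const vec& o) const { return e[0]*o.e[0] + e[1]*o.e[1] + e[2]*o.e[2]; }
};

vec operator-(const vec& a, const vec& b) { return vec{{a[0]-b[0], a[1]-b[1], a[2]-b[2]}}; }
vec operator+(const vec& a, const vec& b) { return vec{{a[0]+b[0], a[1]+b[1], a[2]+b[2]}}; }
vec operator*(float s, const vec& v) { return vec{{s*v[0], s*v[1], s*v[2]}}; }

// Same math as the post: bc[0], bc[1], bc[2] weight a, b, c respectively.
vec ToBarycentricCoords3D(vec a, vec b, vec c, vec p)
{
    vec v0 = b - a, v1 = c - a, v2 = p - a;
    float d00 = v0.Dot(v0), d01 = v0.Dot(v1), d11 = v1.Dot(v1);
    float d20 = v2.Dot(v0), d21 = v2.Dot(v1);
    float denom = d00 * d11 - d01 * d01;   // zero only for degenerate triangles
    vec bc;
    bc[1] = (d11 * d20 - d01 * d21) / denom;
    bc[2] = (d00 * d21 - d01 * d20) / denom;
    bc[0] = 1.0f - bc[1] - bc[2];
    return bc;
}

vec FromBarycentricCoords(vec a, vec b, vec c, vec bc)
{
    return bc[0] * (a - c) + bc[1] * (b - c) + c;
}
```

Note that the coordinates always sum to 1; for points outside the triangle, individual components simply go above 1 or below 0.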

Thanks a lot. My mesh is still pretty twisted, but that's probably my own errors...

One thing though: does "p" have to be on the triangle plane or can it be anywhere?

Anyone know what I'm doing wrong here?...

This is for preparing the "VertMorphInfo" after finding the best (basically closest atm...) triangle for it:


VertMorphInfo[v].Vert1 = BodyIndices[t * 3];
VertMorphInfo[v].Vert2 = BodyIndices[t * 3 + 1];
VertMorphInfo[v].Vert3 = BodyIndices[t * 3 + 2];

FVector Normal = FVector::CrossProduct(p1 - p2, p3 - p1).GetSafeNormal();
Pos = FVector::VectorPlaneProject(Pos, Normal);// <-- also tried without projecting; Pos holds the clothing vertex position from earlier

FVector BaryCoord = ToBarycentricCoords3D(p1, p2, p3, Pos);

VertMorphInfo[v].Weight1 = BaryCoord.X;
VertMorphInfo[v].Weight2 = BaryCoord.Y;
VertMorphInfo[v].Height = (Pos - p1).ProjectOnTo(Normal).Size();

And this is to update the vertex position after the body shape was changed:


FVector Normal = FVector::CrossProduct(p1 - p2, p3 - p1).GetSafeNormal();

FVector BaryCoord(VertMorphInfo[v].Weight1, VertMorphInfo[v].Weight2, 0.f);

FVector V = FromBarycentricCoords(p1, p2, p3, BaryCoord) + Normal * VertMorphInfo[v].Height;

InstModel.StaticVertexBuffers.PositionVertexBuffer.VertexPosition(Offset + v) = V;

It doesn't just look wonky; it's completely off in some regions and slightly off in others.

 

The only other approach I have tried so far was just a placeholder and the simplest thing you can do: I mapped every clothing mesh vertex to the closest body mesh vertex and kept the offset vector. But even that looks much smoother than what I have now. There must be something I'm doing wrong.
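One detail in the preparation code that might contribute (an observation, not a confirmed diagnosis): ProjectOnTo(...).Size() always returns a non-negative length, so the stored Height loses its sign and vertices behind the triangle plane get reconstructed on the wrong side; and if Pos was already projected onto the plane, the height is always zero. A signed height via a dot product with the unit normal keeps the side information. A sketch with a hypothetical minimal Vec3 stand-in (not the real FVector):

```cpp
#include <cassert>
#include <cmath>

// Hypothetical minimal stand-in for the engine's vector type.
struct Vec3 { float x, y, z; };

float Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Signed height of 'pos' over the plane through 'p1' with unit normal 'n':
// positive in front of the plane, negative behind it. Taking the length of a
// projected vector instead would discard this sign.
float SignedHeight(const Vec3& pos, const Vec3& p1, const Vec3& n)
{
    Vec3 d{pos.x - p1.x, pos.y - p1.y, pos.z - p1.z};
    return Dot(d, n);
}
```

With a signed height, the reconstruction step Normal * Height would then restore vertices on either side of the triangle, provided the same normal orientation is used in both passes.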

