I am making a program which reads a texture that should be applied to a mesh and generates some shapes to be displayed on its triangles. I convert the points so that the original shape appears to lie on the XZ plane (in the OpenGL convention for the axes, so Y is vertical, Z points towards the camera, X to the right). Now I have no idea how to properly measure the angle between the actual normal of a triangle and the vertical normal of the image (I mean (0, 1, 0)). I know it's probably basic, but my mind refuses to cooperate on 3D graphics tasks recently.
Currently I use
angles.x = glm::orientedAngle(glm::vec2(normalOfTriangle.z, normalOfTriangle.y), glm::vec2(1.0f, 0.0f));
angles.y = glm::orientedAngle(glm::vec2(normalOfTriangle.x, normalOfTriangle.z), glm::vec2(1.0f, 0.0f));
angles.z = glm::orientedAngle(glm::vec2(normalOfTriangle.x, normalOfTriangle.y), glm::vec2(1.0f, 0.0f));
angles = angles + glm::vec3(-glm::half_pi<float>(), 0.0f, glm::half_pi<float>());
Which, given my way of thinking, should give proper results, but the faces of the cube whose normals should be parallel to the Z axis appear to be unrotated about Z.
My logic is based on measuring the angle from each axis, and then rotating about each axis by that angle so that the shape ends up vertical. But as I said, my mind glitches, and I cannot find the proper way to do it. Can somebody please help?
Source: Windows Questions C++