Using OpenGL to animate instanced objects

  animation, c++, glfw, glm-math, opengl

I want to understand how to create loads of similar 2-D objects and then animate each one separately, using OpenGL.

I have a feeling that it will be done using this and glfwGetTime().

Can anyone here help point me in the right direction?

OK, so here is the general approach I have tried so far:

The per-instance translations are built by the following code, which I have modified slightly to shift each position based on time.

glm::vec2 translations[100];
int index = 0;
float offset = 0.1f;
float time = glfwGetTime(); // new code
for (int y = -10; y < 10; y += 2)
{
    for (int x = -10; x < 10; x += 2)
    {
        glm::vec2 translation;
        translation.x = (float)x / 10.0f + offset + time;        // new adjustment
        translation.y = (float)y / 10.0f + offset + time * time; // new adjustment
        translations[index++] = translation;
    }
}
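
Since the snippet above runs once before the render loop, whatever `glfwGetTime()` returned at startup is baked into the buffer. A minimal sketch of how I imagine recomputing the offsets every frame instead (the helper name and the `Vec2` struct are my own, standing in for `glm::vec2` so the math can be checked in isolation):

```cpp
#include <cassert>
#include <cmath>

// Stand-in for glm::vec2 (assumption, so this sketch compiles on its own).
struct Vec2 { float x, y; };

// Hypothetical helper: recompute the 100 per-instance offsets for a given time.
// Same math as the setup code above, but callable once per frame.
void computeTranslations(Vec2 out[100], float time, float offset = 0.1f)
{
    int index = 0;
    for (int y = -10; y < 10; y += 2)
    {
        for (int x = -10; x < 10; x += 2)
        {
            out[index].x = (float)x / 10.0f + offset + time;
            out[index].y = (float)y / 10.0f + offset + time * time;
            ++index;
        }
    }
}
```

After recomputing, the instance buffer would presumably need re-uploading each frame, e.g. with `glBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(translations), translations)` on the bound instance VBO, but I have not verified that part.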

Later, in the render loop,

while (!glfwWindowShouldClose(window))
{
    glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    glDrawArraysInstanced(GL_TRIANGLES, 0, 6, 100); // 100 instances of a 6-vertex quad (2 triangles)

    time = glfwGetTime(); // new adjustment

    glfwSwapBuffers(window);
    glfwPollEvents();
}


is what I have tried. I suspect I am misunderstanding how the graphics pipeline works. As I mentioned earlier, my guess is that I need to use some glm matrices to make this work as I imagined, but I am not sure …
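
For what it's worth, another approach I have been considering (just a sketch, not tested): keep the per-instance translations static and pass the elapsed time as a uniform, applying the animation in the vertex shader instead of on the CPU. The uniform name `uTime` and the attribute layout locations here are my own assumptions:

```glsl
#version 330 core
layout (location = 0) in vec2 aPos;     // quad vertex position
layout (location = 2) in vec2 aOffset;  // per-instance translation (attrib divisor = 1)

uniform float uTime; // hypothetical uniform, set each frame via glUniform1f

void main()
{
    // shift every instance's static offset over time,
    // matching the x += time, y += time*time idea above
    vec2 animated = aOffset + vec2(uTime, uTime * uTime);
    gl_Position = vec4(aPos + animated, 0.0, 1.0);
}
```

That would avoid re-uploading the instance buffer every frame, if it works the way I think it does.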

Source: Windows Questions C++