
OrbitControl() Compatibility with imageLight() by fixing camera. #6735

Merged · 5 commits · Jan 19, 2024

Conversation

@perminder-17 (Contributor) commented Jan 13, 2024

Resolves #6688

Changes:

Screenshots of the change:

orbitCont.mp4

Sketch for testing @davepagurek

https://editor.p5js.org/aman12345/sketches/_QiQ5omfK

PR Checklist

@davepagurek (Contributor) left a comment


Thanks for taking this on, this is a great start! I left a few comments about some of the math and also a potential bug in some existing code that you haven't changed that is now revealed.

@@ -343,6 +343,9 @@ p5.Shader = class {
this._renderer.uNMatrix.inverseTranspose(this._renderer.uMVMatrix);
this.setUniform('uNormalMatrix', this._renderer.uNMatrix.mat3);
}
const cameraMat = this._renderer._curCamera.cameraMatrix;
const camMat3x3 = cameraMat.createSubMatrix3x3();
this.setUniform('uCameraMatrix', camMat3x3.mat3);
@davepagurek (Contributor):

Maybe we can do something similar to what we do for the normal matrix above by:

  • putting this block in an if (this.uniforms.uCameraMatrix) { ... } to avoid doing extra work for shaders that don't need it
  • storing a matrix on the renderer for this, similar to this._renderer.uNMatrix, to reuse resources
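A minimal sketch of that pattern, using hypothetical stand-ins for the shader and renderer objects (p5's real internals differ; this only illustrates the guard-plus-cached-matrix idea):

```javascript
// Hypothetical stand-ins for p5's internals, just to illustrate the pattern.
const renderer = {
  _curCamera: {
    cameraMatrix: {
      // Returns the upper-left 3x3 of the 4x4 camera matrix (stubbed here).
      createSubMatrix3x3: () => ({ mat3: [1, 0, 0, 0, 1, 0, 0, 0, 1] })
    }
  },
  uCameraMatrix: null // cached on the renderer, reused like uNMatrix
};

let uniformsSet = 0;
const shader = {
  uniforms: { uCameraMatrix: true }, // only shaders declaring the uniform pay the cost
  setUniform(name, value) { uniformsSet++; }
};

// Guarded update: skip the sub-matrix work entirely for shaders
// that never declared uCameraMatrix.
if (shader.uniforms.uCameraMatrix) {
  renderer.uCameraMatrix =
    renderer._curCamera.cameraMatrix.createSubMatrix3x3();
  shader.setUniform('uCameraMatrix', renderer.uCameraMatrix.mat3);
}
```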

@@ -343,6 +343,9 @@ p5.Shader = class {
this._renderer.uNMatrix.inverseTranspose(this._renderer.uMVMatrix);
this.setUniform('uNormalMatrix', this._renderer.uNMatrix.mat3);
}
const cameraMat = this._renderer._curCamera.cameraMatrix;
const camMat3x3 = cameraMat.createSubMatrix3x3();
@davepagurek (Contributor):

Typically, for matrices used to update normals, we use the inverse transpose of the transform applied to the vertices, like we do above for the normal matrix. Does it work if we do the same here? (If both seem to work, it would be interesting to log both values and see if they differ.)

Assuming we do the inverse transpose, we should probably rename the uniform to something like uCameraRotation to make it clear that it isn't intended to apply the camera to vertex positions, only to rotate normals.

vec3 worldNormal = normalize(vNormal);
vec3 lightDirection = normalize( vViewPosition - worldCameraPosition );
vec3 worldNormal = normalize(vNormal * uCameraMatrix);
vec3 lightDirection = normalize( vViewPosition * uCameraMatrix - worldCameraPosition );
@davepagurek (Contributor):

The types of transforms you apply to normals and to positions are generally different (normals must always stay perpendicular to the surface, while positions can squish; see the diagram in this SO answer for an idea of why normals need something different). So generally you won't find yourself multiplying both positions and normals by the same matrix.
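A quick numerical illustration of this point (a self-contained sketch, not p5 code): under a non-uniform scale, multiplying a normal by the same matrix as the positions breaks perpendicularity, while the inverse transpose preserves it.

```javascript
// Why normals need the inverse transpose: compare a tangent/normal pair
// under a non-uniform scale of the positions.

const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Non-uniform scale: x is stretched by 2. Because it's diagonal,
// its inverse transpose is simply diag(1/2, 1, 1).
const scale = v => [2 * v[0], v[1], v[2]];
const invTranspose = v => [0.5 * v[0], v[1], v[2]];

// A surface tangent and its normal, perpendicular by construction.
const tangent = [1, 1, 0];
const normal = [1, -1, 0];
console.log(dot(tangent, normal)); // 0

const newTangent = scale(tangent); // positions/tangents use the transform itself

console.log(dot(newTangent, scale(normal)));        // 3: no longer perpendicular
console.log(dot(newTangent, invTranspose(normal))); // 0: perpendicularity preserved
```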

In this case, I think we only need to do it to the normal, and probably only after doing the reflection:

vec3 worldNormal = normalize(vNormal);
vec3 lightDirection = normalize( vViewPosition - worldCameraPosition );
vec3 R = reflect(lightDirection, worldNormal) * uCameraMatrix;

The idea is that, without modifying the normal or position here, we would be looking up the texture correctly if the sphere map that surrounds the scene is un-rotated. To account for its rotation, only at the end would we apply the rotation matrix to it.

vec3 worldNormal = normalize(vNormal);
vec3 lightDirection = normalize( vViewPosition - worldCameraPosition );
vec3 worldNormal = normalize(vNormal * uCameraMatrix);
vec3 lightDirection = normalize( vViewPosition * uCameraMatrix - worldCameraPosition );
vec3 R = reflect(lightDirection, worldNormal);
vec2 newTexCoor = mapTextureToNormal( R );
@davepagurek (Contributor):

I noticed something kinda interesting. When you rotate around 360 degrees, you can always see the sun in the reflection, even though presumably at some point it should be behind the sphere and not in the reflection:

Screen.Recording.2024-01-14.at.10.58.29.AM.mov

I wonder if this is a bug in mapTextureToNormal that we didn't notice before because we couldn't rotate the spheremap texture like we can now. Is that something you're interested in looking into as part of this change? The idea is to convert a normal into a spherical coordinate (an azimuthal angle and a polar angle, and a fixed radius of 1), and then map those angles to 2D texture coordinates on the input image. Maybe we're doing that slightly incorrectly?
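The normal-to-texture-coordinate mapping described here can be sketched as follows. This is the general idea (unit normal to spherical angles to [0, 1] texture coordinates), not p5's actual mapTextureToNormal implementation:

```javascript
// Convert a unit normal into equirectangular texture coordinates:
// an azimuthal angle around the y axis and a polar angle from +y.
function normalToTexCoord(n) {
  // Azimuthal angle in (-PI, PI].
  const azimuth = Math.atan2(n[2], n[0]);
  // Polar angle from the +y axis, in [0, PI]; clamp guards float error.
  const polar = Math.acos(Math.max(-1, Math.min(1, n[1])));
  // Map azimuth -> u in [0, 1), polar -> v in [0, 1].
  return [(azimuth + Math.PI) / (2 * Math.PI), polar / Math.PI];
}

console.log(normalToTexCoord([0, 1, 0]));  // straight up: v = 0 (top of the image)
console.log(normalToTexCoord([0, -1, 0])); // straight down: v = 1 (bottom)
```

Bugs in this kind of mapping often come from mixing up which axis is "up" or from the atan2 argument order, so checking a few known directions like the ones above is a cheap sanity test.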

@perminder-17 (Contributor, Author) commented Jan 15, 2024

When you rotate around 360 degrees, you can always see the sun in the reflection, even though presumably at some point it should be behind the sphere and not in the reflection

Good catch @davepagurek :')

Does it work if we do the same here? (if both seem to work, it would be interesting logging both values to see if they differ.)

I have committed the changes you mentioned above, and it works exactly the same way as before. Yes, I noticed some differences between the two (with inverseTranspose and without it, i.e. the earlier version).

[Comparison screenshots: without inverseTranspose vs. with inverseTranspose]

I wonder if this is a bug in mapTextureToNormal that we didn't notice before because we couldn't rotate the spheremap texture like we can now. Is that something you're interested in looking into as part of this change? The idea is to convert a normal into a spherical coordinate (an azimuthal angle and a polar angle, and a fixed radius of 1), and then map those angles to 2D texture coordinates on the input image. Maybe we're doing that slightly incorrectly?

It's good that we caught this bug too. I will look into it and try to fix it.

@davepagurek (Contributor):

Update: I was messing around with the same effect in Blender, and it looks like maybe that behaviour with the sun that I was mentioning is actually just how mirror balls look for real.

blender.mp4

@davepagurek davepagurek merged commit 832612b into processing:main Jan 19, 2024
2 checks passed
@perminder-17 perminder-17 deleted the orbitControl branch January 19, 2024 02:02
@perminder-17 (Contributor, Author):

Thanks for merging it ❤

Linked issue: orbitControl() Compatibility with imageLight() · 2 participants