Class OVRUtil
- java.lang.Object
  - org.lwjgl.ovr.OVRUtil

public class OVRUtil extends java.lang.Object

Native bindings to the libOVR utility functions.
Field Summary

Fields (all static int):
- ovrProjection_ClipRangeOpenGL - Enable if the application is rendering with OpenGL and expects a projection matrix with a clipping range of (-w to w).
- ovrProjection_FarClipAtInfinity - When this flag is used, the zfar value pushed into Matrix4f_Projection will be ignored. NOTE: Enable only if Projection_FarLessThanNear is also enabled, where the far clipping plane will be pushed to infinity.
- ovrProjection_FarLessThanNear - After the projection transform is applied, far values stored in the depth buffer will be less than closer depth values.
- ovrProjection_LeftHanded - Enable if using left-handed transformations in your application.
- ovrProjection_None - Use for generating a default projection matrix that is: right-handed; near depth values stored in the depth buffer are smaller than far depth values; both near and far are explicitly defined; with a clipping range that is (0 to w).
-
Method Summary

All Methods, Static Methods, Concrete Methods:
- static void ovr_CalcEyePoses(OVRPosef headPose, OVRVector3f.Buffer HmdToEyeOffset, OVRPosef.Buffer outEyePoses)
  Computes offset eye poses based on headPose returned by OVRTrackingState.
- static OVRDetectResult ovr_Detect(int timeoutMilliseconds, OVRDetectResult __result)
  Detects Oculus Runtime and Device Status.
- static void ovr_GetEyePoses(long session, long frameIndex, boolean latencyMarker, OVRVector3f.Buffer hmdToEyeOffset, OVRPosef.Buffer outEyePoses, double[] outSensorSampleTime)
  Array version of: _GetEyePoses
- static void ovr_GetEyePoses(long session, long frameIndex, boolean latencyMarker, OVRVector3f.Buffer hmdToEyeOffset, OVRPosef.Buffer outEyePoses, java.nio.DoubleBuffer outSensorSampleTime)
  Returns the predicted head pose in outHmdTrackingState and offset eye poses in outEyePoses.
- static OVRMatrix4f ovrMatrix4f_OrthoSubProjection(OVRMatrix4f projection, OVRVector2f orthoScale, float orthoDistance, float HmdToEyeOffsetX, OVRMatrix4f __result)
  Generates an orthographic sub-projection.
- static OVRMatrix4f ovrMatrix4f_Projection(OVRFovPort fov, float znear, float zfar, int projectionModFlags, OVRMatrix4f __result)
  Used to generate projection from ovrEyeDesc::Fov.
- static void ovrPosef_FlipHandedness(OVRPosef inPose, OVRPosef outPose)
  Tracking poses provided by the SDK come in a right-handed coordinate system.
- static OVRTimewarpProjectionDesc ovrTimewarpProjectionDesc_FromProjection(OVRMatrix4f projection, int projectionModFlags, OVRTimewarpProjectionDesc __result)
  Extracts the required data from the result of Matrix4f_Projection.
-
-
-
Field Detail
-
ovrProjection_None
public static final int ovrProjection_None
Use for generating a default projection matrix that is:
- Right-handed.
- Near depth values stored in the depth buffer are smaller than far depth values.
- Both near and far are explicitly defined.
- With a clipping range that is (0 to w).

- See Also:
  - Constant Field Values
-
ovrProjection_LeftHanded
public static final int ovrProjection_LeftHanded
Enable if using left-handed transformations in your application.

- See Also:
  - Constant Field Values
-
ovrProjection_FarLessThanNear
public static final int ovrProjection_FarLessThanNear
After the projection transform is applied, far values stored in the depth buffer will be less than closer depth values. NOTE: Enable only if the application is using a floating-point depth buffer for proper precision.

- See Also:
  - Constant Field Values
-
ovrProjection_FarClipAtInfinity
public static final int ovrProjection_FarClipAtInfinity
When this flag is used, the zfar value pushed into Matrix4f_Projection will be ignored. NOTE: Enable only if Projection_FarLessThanNear is also enabled, where the far clipping plane will be pushed to infinity.

- See Also:
  - Constant Field Values
-
ovrProjection_ClipRangeOpenGL
public static final int ovrProjection_ClipRangeOpenGL
Enable if the application is rendering with OpenGL and expects a projection matrix with a clipping range of (-w to w). Ignore this flag if your application already handles the conversion from D3D range (0 to w) to OpenGL.

- See Also:
  - Constant Field Values
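These projection modifiers are bit flags and can be OR'ed together before being passed as projectionModFlags. The numeric values below are an assumption mirroring upstream libOVR's OVR_CAPI_Util.h (0x0, 0x1, 0x2, 0x4, 0x8); the authoritative values for this binding are on the Constant Field Values page.

```java
public class ProjectionFlagsDemo {
    // Assumed constant values, mirroring upstream libOVR; in real code,
    // use the OVRUtil.ovrProjection_* constants instead.
    static final int ovrProjection_None              = 0x0;
    static final int ovrProjection_LeftHanded        = 0x1;
    static final int ovrProjection_FarLessThanNear   = 0x2;
    static final int ovrProjection_FarClipAtInfinity = 0x4;
    static final int ovrProjection_ClipRangeOpenGL   = 0x8;

    public static void main(String[] args) {
        // Typical OpenGL setup: reversed depth with an infinite far plane,
        // plus the (-w, w) clip-range conversion.
        int flags = ovrProjection_FarLessThanNear
                  | ovrProjection_FarClipAtInfinity
                  | ovrProjection_ClipRangeOpenGL;

        // Per the docs, FarClipAtInfinity is only valid together with
        // FarLessThanNear.
        if ((flags & ovrProjection_FarClipAtInfinity) != 0
                && (flags & ovrProjection_FarLessThanNear) == 0) {
            throw new IllegalArgumentException(
                "FarClipAtInfinity requires FarLessThanNear");
        }

        System.out.println(flags); // prints 14 (0xE)
    }
}
```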
-
-
Method Detail
-
ovr_Detect
public static OVRDetectResult ovr_Detect(int timeoutMilliseconds, OVRDetectResult __result)
Detects Oculus Runtime and Device Status.

Checks for Oculus Runtime and Oculus HMD device status without loading the LibOVRRT shared library. This may be called before _Initialize to help decide whether or not to initialize LibOVR.

- Parameters:
  - timeoutMilliseconds - a timeout to wait for HMD to be attached or 0 to poll
-
ovrMatrix4f_Projection
public static OVRMatrix4f ovrMatrix4f_Projection(OVRFovPort fov, float znear, float zfar, int projectionModFlags, OVRMatrix4f __result)
Used to generate projection from ovrEyeDesc::Fov.

- Parameters:
  - fov - the OVRFovPort to use
  - znear - distance to near Z limit
  - zfar - distance to far Z limit
  - projectionModFlags - a combination of the ovrProjectionModifier flags. One or more of: Projection_None, Projection_FarLessThanNear, Projection_FarClipAtInfinity, Projection_ClipRangeOpenGL
  - __result - the calculated projection matrix
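To make the default (Projection_None) behaviour concrete, here is a plain-Java sketch of the matrix this function produces: right-handed view space (camera looking down -Z), a (0 to w) clip range, and a field of view given as tangents of the half-angles as in OVRFovPort. The off-center sign conventions are an assumption; this illustrates the math, it is not the binding itself.

```java
public class ProjectionSketch {
    /**
     * Right-handed perspective projection with a (0 to w) clip range,
     * sketching the default ovrProjection_None output. Row-major 4x4.
     */
    static float[][] projection(float upTan, float downTan,
                                float leftTan, float rightTan,
                                float znear, float zfar) {
        float xScale  = 2.0f / (leftTan + rightTan);
        float xOffset = (leftTan - rightTan) * xScale * 0.5f;
        float yScale  = 2.0f / (upTan + downTan);
        float yOffset = (upTan - downTan) * yScale * 0.5f;

        float[][] m = new float[4][4];
        m[0][0] = xScale;
        m[0][2] = -xOffset;                      // assumed sign convention
        m[1][1] = yScale;
        m[1][2] = yOffset;                       // assumed sign convention
        m[2][2] = zfar / (znear - zfar);         // maps z = -znear to depth 0
        m[2][3] = zfar * znear / (znear - zfar); // ... and z = -zfar to depth 1
        m[3][2] = -1.0f;                         // w = -z in right-handed view space
        return m;
    }
}
```

Note that near depth values come out smaller than far ones, matching the Projection_None description; Projection_FarLessThanNear would invert that mapping.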
-
ovrTimewarpProjectionDesc_FromProjection
public static OVRTimewarpProjectionDesc ovrTimewarpProjectionDesc_FromProjection(OVRMatrix4f projection, int projectionModFlags, OVRTimewarpProjectionDesc __result)
Extracts the required data from the result of Matrix4f_Projection.

- Parameters:
  - projection - the projection matrix from which to extract OVRTimewarpProjectionDesc
  - projectionModFlags - a combination of the ovrProjectionModifier flags. One or more of: Projection_None, Projection_FarLessThanNear, Projection_FarClipAtInfinity, Projection_ClipRangeOpenGL
  - __result - the extracted ovrTimewarpProjectionDesc
-
ovrMatrix4f_OrthoSubProjection
public static OVRMatrix4f ovrMatrix4f_OrthoSubProjection(OVRMatrix4f projection, OVRVector2f orthoScale, float orthoDistance, float HmdToEyeOffsetX, OVRMatrix4f __result)
Generates an orthographic sub-projection.

Used for 2D rendering, Y is down.

- Parameters:
  - projection - the perspective matrix that the orthographic matrix is derived from
  - orthoScale - equal to 1.0f / pixelsPerTanAngleAtCenter
  - orthoDistance - equal to the distance from the camera in meters, such as 0.8m
  - HmdToEyeOffsetX - the offset of the eye from the center
  - __result - the calculated projection matrix
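The orthoScale parameter is the reciprocal of pixelsPerTanAngleAtCenter, a density that can be derived from the projection scale and the per-eye render-target size. A small sketch, under the assumption that NDC x spans 2 units across the target width (the numeric inputs below are hypothetical):

```java
public class OrthoScaleSketch {
    /**
     * Derives the X component of orthoScale from the projection's X scale
     * (projection.M[0][0]) and the per-eye render-target width in pixels.
     */
    static float orthoScaleX(float projXScale, float textureWidthPixels) {
        // x_ndc = projXScale * tanAngle, and NDC x covers textureWidthPixels
        // pixels over a span of 2, so:
        float pixelsPerTanAngleAtCenterX = projXScale * textureWidthPixels * 0.5f;
        return 1.0f / pixelsPerTanAngleAtCenterX;
    }

    public static void main(String[] args) {
        // Hypothetical values: unit projection scale, 1344 px wide eye target.
        System.out.println(orthoScaleX(1.0f, 1344.0f));
    }
}
```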
-
ovr_CalcEyePoses
public static void ovr_CalcEyePoses(OVRPosef headPose, OVRVector3f.Buffer HmdToEyeOffset, OVRPosef.Buffer outEyePoses)
Computes offset eye poses based on headPose returned by OVRTrackingState.

- Parameters:
  - headPose - indicates the HMD position and orientation to use for the calculation
  - HmdToEyeOffset - can be OVREyeRenderDesc.HmdToEyeViewOffset returned from _GetRenderDesc. For monoscopic rendering, use a vector that is the average of the two vectors for both eyes.
  - outEyePoses - if outEyePoses are used for rendering, they should be passed to _SubmitFrame in OVRLayerEyeFov::RenderPose or OVRLayerEyeFovDepth::RenderPose
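The monoscopic-rendering note above amounts to averaging the two per-eye offset vectors. A plain-array sketch (the offset values are hypothetical, roughly half an IPD to each side; real code would read them from OVREyeRenderDesc):

```java
public class MonoOffsetSketch {
    /** Averages the two per-eye offsets into a single monoscopic offset. */
    static float[] monoOffset(float[] leftOffset, float[] rightOffset) {
        float[] mono = new float[3];
        for (int i = 0; i < 3; i++) {
            mono[i] = (leftOffset[i] + rightOffset[i]) * 0.5f;
        }
        return mono;
    }

    public static void main(String[] args) {
        float[] mono = monoOffset(new float[] {-0.032f, 0f, 0f},
                                  new float[] { 0.032f, 0f, 0f});
        // The averaged vector would then fill both entries of the
        // HmdToEyeOffset buffer passed to ovr_CalcEyePoses.
        System.out.println(java.util.Arrays.toString(mono)); // prints [0.0, 0.0, 0.0]
    }
}
```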
-
ovr_GetEyePoses
public static void ovr_GetEyePoses(long session, long frameIndex, boolean latencyMarker, OVRVector3f.Buffer hmdToEyeOffset, OVRPosef.Buffer outEyePoses, java.nio.DoubleBuffer outSensorSampleTime)

Returns the predicted head pose in outHmdTrackingState and offset eye poses in outEyePoses.

This is a thread-safe function where the caller should increment frameIndex with every frame and pass that index where applicable to functions called on the rendering thread. Assuming outEyePoses are used for rendering, they should be passed as a part of OVRLayerEyeFov. The caller does not need to worry about applying HmdToEyeOffset to the returned outEyePoses variables.

- Parameters:
  - session - an ovrSession previously returned by _Create
  - frameIndex - the targeted frame index, or 0 to refer to one frame after the last time _SubmitFrame was called
  - latencyMarker - specifies that this call is the point in time where the "App-to-Mid-Photon" latency timer starts from. If a given ovrLayer provides "SensorSampleTimestamp", that will override the value stored here.
  - hmdToEyeOffset - can be OVREyeRenderDesc.HmdToEyeOffset returned from _GetRenderDesc. For monoscopic rendering, use a vector that is the average of the two vectors for both eyes.
  - outEyePoses - the predicted eye poses
  - outSensorSampleTime - the time when this function was called. May be NULL, in which case it is ignored.
-
ovrPosef_FlipHandedness
public static void ovrPosef_FlipHandedness(OVRPosef inPose, OVRPosef outPose)
Tracking poses provided by the SDK come in a right-handed coordinate system. If an application is passing Projection_LeftHanded into Matrix4f_Projection, then it should also use this function to flip the HMD tracking poses to be left-handed.

While this utility function is intended to convert a left-handed OVRPosef into a right-handed coordinate system, it will also work for converting right-handed to left-handed, since the flip operation is the same for both cases.

- Parameters:
  - inPose - a pose that is right-handed
  - outPose - the pose that is requested to be left-handed (can be the same pointer as inPose)
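Since the flip operation is its own inverse, it can be sketched in plain Java. This assumes the handedness change mirrors the Z axis: the position's z component negates, and the orientation quaternion's x and y components negate (equivalently, z and w could be negated instead, since q and -q encode the same rotation). The real binding operates on OVRPosef structs.

```java
public class FlipHandednessSketch {
    /** Pose packed as {px, py, pz, qx, qy, qz, qw}. Assumed Z-axis mirror. */
    static double[] flipHandedness(double[] pose) {
        return new double[] {
            pose[0], pose[1], -pose[2],   // position: negate z
            -pose[3], -pose[4], pose[5], pose[6] // quaternion: negate x and y
        };
    }

    public static void main(String[] args) {
        double[] rh = {1, 2, 3, 0, 0, 0, 1}; // identity orientation, pos (1,2,3)
        double[] lh = flipHandedness(rh);
        // Applying the flip twice recovers the original pose,
        // which is why the same function works in both directions.
        double[] back = flipHandedness(lh);
        System.out.println(back[2] == rh[2]);
    }
}
```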
-
ovr_GetEyePoses
public static void ovr_GetEyePoses(long session, long frameIndex, boolean latencyMarker, OVRVector3f.Buffer hmdToEyeOffset, OVRPosef.Buffer outEyePoses, double[] outSensorSampleTime)

Array version of: _GetEyePoses
-
-