Package org.lwjgl.ovr

Class OVRUtil



  • public class OVRUtil
    extends java.lang.Object
    Native bindings to the libOVR utility functions.
    • Field Detail

      • ovrProjection_None

        public static final int ovrProjection_None
        Use for generating a default projection matrix that:
        • Is right-handed.
        • Stores near depth values in the depth buffer as smaller than far depth values.
        • Has both near and far clipping planes explicitly defined.
        • Uses a clipping range of (0 to w).
        See Also:
        Constant Field Values
      • ovrProjection_LeftHanded

        public static final int ovrProjection_LeftHanded
        Enable if using left-handed transformations in your application.
        See Also:
        Constant Field Values
      • ovrProjection_FarLessThanNear

        public static final int ovrProjection_FarLessThanNear
        After the projection transform is applied, depth values for far objects stored in the depth buffer are smaller than those for near objects. NOTE: enable this only if the application uses a floating-point depth buffer, which is required for adequate precision.
        See Also:
        Constant Field Values
      • ovrProjection_FarClipAtInfinity

        public static final int ovrProjection_FarClipAtInfinity
        When this flag is used, the zfar value passed to Matrix4f_Projection is ignored. NOTE: enable this only if ovrProjection_FarLessThanNear is also enabled; the far clipping plane is then pushed to infinity.
        See Also:
        Constant Field Values
      • ovrProjection_ClipRangeOpenGL

        public static final int ovrProjection_ClipRangeOpenGL
        Enable if the application is rendering with OpenGL and expects a projection matrix with a clipping range of (-w to w). Ignore this flag if your application already handles the conversion from D3D range (0 to w) to OpenGL.
        See Also:
        Constant Field Values
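These flags are bit masks and can be combined with bitwise OR before being passed to Matrix4f_Projection (bound in this class as ovrMatrix4f_Projection). A minimal sketch, assuming the LibOVR runtime is available; the FOV tangents and clip distances below are hypothetical placeholder values:

```java
import org.lwjgl.ovr.OVRFovPort;
import org.lwjgl.ovr.OVRMatrix4f;

import static org.lwjgl.ovr.OVRUtil.*;

public class ProjectionFlagsExample {
    public static void main(String[] args) {
        // Hypothetical FOV; a real application takes this from
        // ovr_GetHmdDesc().DefaultEyeFov(eye).
        OVRFovPort fov = OVRFovPort.calloc()
            .UpTan(1.0f).DownTan(1.0f).LeftTan(1.0f).RightTan(1.0f);

        // Combine flags with bitwise OR: OpenGL clip range plus a reversed,
        // infinite far plane (requires a floating-point depth buffer).
        int flags = ovrProjection_ClipRangeOpenGL
                  | ovrProjection_FarLessThanNear
                  | ovrProjection_FarClipAtInfinity;

        OVRMatrix4f proj = OVRMatrix4f.calloc();
        // zfar is ignored here because ovrProjection_FarClipAtInfinity is set.
        ovrMatrix4f_Projection(fov, 0.1f, 1000.0f, flags, proj);

        proj.free();
        fov.free();
    }
}
```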
    • Method Detail

      • ovr_Detect

        public static OVRDetectResult ovr_Detect(int timeoutMilliseconds,
                                                 OVRDetectResult __result)
        Detects Oculus Runtime and Device Status.

        Checks the Oculus Runtime and Oculus HMD device status without loading the LibOVRRT shared library. This may be called before _Initialize to help decide whether to initialize LibOVR.

        Parameters:
        timeoutMilliseconds - a timeout to wait for an HMD to be attached, or 0 to poll
        __result - the detection result
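A sketch of a pre-initialization check. The field accessors follow LWJGL's OVRDetectResult struct; no runtime state beyond the library itself is assumed:

```java
import org.lwjgl.ovr.OVRDetectResult;

import static org.lwjgl.ovr.OVRUtil.ovr_Detect;

public class DetectExample {
    public static void main(String[] args) {
        OVRDetectResult result = OVRDetectResult.calloc();

        // Wait up to 500 ms for an HMD to be attached; pass 0 to poll instead.
        ovr_Detect(500, result);

        if (result.IsOculusServiceRunning() && result.IsOculusHMDConnected()) {
            // Safe to proceed with ovr_Initialize / ovr_Create.
            System.out.println("Oculus runtime and HMD detected.");
        }
        result.free();
    }
}
```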
      • ovrMatrix4f_OrthoSubProjection

        public static OVRMatrix4f ovrMatrix4f_OrthoSubProjection(OVRMatrix4f projection,
                                                                 OVRVector2f orthoScale,
                                                                 float orthoDistance,
                                                                 float HmdToEyeOffsetX,
                                                                 OVRMatrix4f __result)
        Generates an orthographic sub-projection.

        Used for 2D rendering; Y is down.

        Parameters:
        projection - the perspective matrix that the orthographic matrix is derived from
        orthoScale - equal to 1.0f / pixelsPerTanAngleAtCenter
        orthoDistance - equal to the distance from the camera in meters, such as 0.8m
        HmdToEyeOffsetX - the offset of the eye from the center
        __result - the calculated projection matrix
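A sketch of deriving an orthographic sub-projection from a perspective matrix, assuming the LibOVR runtime is available. The FOV, pixelsPerTanAngleAtCenter, and eye-offset values below are hypothetical; a real application would take them from the HMD's render description:

```java
import org.lwjgl.ovr.OVRFovPort;
import org.lwjgl.ovr.OVRMatrix4f;
import org.lwjgl.ovr.OVRVector2f;

import static org.lwjgl.ovr.OVRUtil.*;

public class OrthoSubProjectionExample {
    public static void main(String[] args) {
        // Base perspective matrix (hypothetical FOV and clip planes).
        OVRFovPort fov = OVRFovPort.calloc()
            .UpTan(1.0f).DownTan(1.0f).LeftTan(1.0f).RightTan(1.0f);
        OVRMatrix4f persp = OVRMatrix4f.calloc();
        ovrMatrix4f_Projection(fov, 0.1f, 1000.0f, ovrProjection_None, persp);

        // orthoScale = 1.0f / pixelsPerTanAngleAtCenter (500 is hypothetical).
        OVRVector2f orthoScale = OVRVector2f.calloc()
            .x(1.0f / 500.0f)
            .y(1.0f / 500.0f);

        OVRMatrix4f ortho = OVRMatrix4f.calloc();
        // 0.8f = distance from the camera in meters; 0.032f = example eye offset.
        ovrMatrix4f_OrthoSubProjection(persp, orthoScale, 0.8f, 0.032f, ortho);

        ortho.free();
        orthoScale.free();
        persp.free();
        fov.free();
    }
}
```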
      • ovr_CalcEyePoses

        public static void ovr_CalcEyePoses(OVRPosef headPose,
                                            OVRVector3f.Buffer HmdToEyeOffset,
                                            OVRPosef.Buffer outEyePoses)
        Computes offset eye poses based on headPose returned by OVRTrackingState.
        Parameters:
        headPose - indicates the HMD position and orientation to use for the calculation
        HmdToEyeOffset - can be OVREyeRenderDesc.HmdToEyeViewOffset returned from _GetRenderDesc. For monoscopic rendering, use a vector that is the average of the two vectors for both eyes.
        outEyePoses - if outEyePoses are used for rendering, they should be passed to _SubmitFrame in OVRLayerEyeFov::RenderPose or OVRLayerEyeFovDepth::RenderPose
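A sketch of the calculation, assuming the LibOVR runtime is available. The identity head pose and zero eye offsets are placeholders; a real application takes them from OVRTrackingState and OVREyeRenderDesc respectively:

```java
import org.lwjgl.ovr.OVRPosef;
import org.lwjgl.ovr.OVRVector3f;

import static org.lwjgl.ovr.OVRUtil.ovr_CalcEyePoses;

public class CalcEyePosesExample {
    public static void main(String[] args) {
        // Placeholder head pose; normally OVRTrackingState.HeadPose().ThePose().
        OVRPosef headPose = OVRPosef.calloc();
        headPose.Orientation().w(1.0f); // identity orientation

        // One offset per eye, normally from OVREyeRenderDesc; zeros for brevity.
        OVRVector3f.Buffer hmdToEyeOffset = OVRVector3f.calloc(2);

        OVRPosef.Buffer eyePoses = OVRPosef.calloc(2);
        ovr_CalcEyePoses(headPose, hmdToEyeOffset, eyePoses);

        // eyePoses.get(0) / eyePoses.get(1) would be passed to _SubmitFrame
        // via the layer's RenderPose fields.
        eyePoses.free();
        hmdToEyeOffset.free();
        headPose.free();
    }
}
```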
      • ovr_GetEyePoses

        public static void ovr_GetEyePoses(long session,
                                           long frameIndex,
                                           boolean latencyMarker,
                                           OVRVector3f.Buffer hmdToEyeOffset,
                                           OVRPosef.Buffer outEyePoses,
                                           java.nio.DoubleBuffer outSensorSampleTime)
        Returns the predicted eye poses in outEyePoses, offset from the predicted head pose by hmdToEyeOffset.

        This function is thread-safe. The caller should increment frameIndex with every frame and pass that index, where applicable, to functions called on the rendering thread. If outEyePoses are used for rendering, they should be passed as part of OVRLayerEyeFov. The caller does not need to apply HmdToEyeOffset to the returned outEyePoses values; it is already applied.

        Parameters:
        session - an ovrSession previously returned by _Create
        frameIndex - the targeted frame index, or 0 to refer to one frame after the last time _SubmitFrame was called
        latencyMarker - specifies that this call marks the point in time from which the "App-to-Mid-Photon" latency timer starts. If a given ovrLayer provides "SensorSampleTimestamp", that value overrides the one stored here.
        hmdToEyeOffset - can be OVREyeRenderDesc.HmdToEyeOffset returned from _GetRenderDesc. For monoscopic rendering, use a vector that is the average of the two vectors for both eyes.
        outEyePoses - the predicted eye poses
        outSensorSampleTime - the time when this function was called. May be NULL, in which case it is ignored.
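A sketch of a per-frame call, assuming a valid session handle previously returned by _Create and eye offsets from _GetRenderDesc (zeros here for brevity):

```java
import java.nio.DoubleBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.ovr.OVRPosef;
import org.lwjgl.ovr.OVRVector3f;

import static org.lwjgl.ovr.OVRUtil.ovr_GetEyePoses;

public class GetEyePosesExample {
    // 'session' is assumed to be a valid handle from ovr_Create.
    static void queryEyePoses(long session, long frameIndex) {
        // One offset per eye, normally from OVREyeRenderDesc; zeros for brevity.
        OVRVector3f.Buffer hmdToEyeOffset = OVRVector3f.calloc(2);
        OVRPosef.Buffer eyePoses = OVRPosef.calloc(2);
        DoubleBuffer sampleTime = BufferUtils.createDoubleBuffer(1);

        // latencyMarker = true starts the App-to-Mid-Photon latency timer here.
        ovr_GetEyePoses(session, frameIndex, true,
                        hmdToEyeOffset, eyePoses, sampleTime);

        // eyePoses would go into the layer's RenderPose fields, and
        // sampleTime.get(0) into SensorSampleTime, before _SubmitFrame.
        eyePoses.free();
        hmdToEyeOffset.free();
    }
}
```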
      • ovrPosef_FlipHandedness

        public static void ovrPosef_FlipHandedness(OVRPosef inPose,
                                                   OVRPosef outPose)
        Tracking poses provided by the SDK come in a right-handed coordinate system. If an application passes Projection_LeftHanded to Matrix4f_Projection, it should also use this function to flip the HMD tracking poses to left-handed.

        Although this utility function is intended to convert a right-handed OVRPosef into a left-handed coordinate system, it also works for the reverse conversion, since the flip operation is the same in both cases.

        Parameters:
        inPose - a pose that is right-handed
        outPose - the flipped, left-handed pose (may point to the same struct as inPose)
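A sketch of an in-place flip, assuming the LibOVR runtime is available; the identity pose below is a placeholder for a pose obtained from tracking:

```java
import org.lwjgl.ovr.OVRPosef;

import static org.lwjgl.ovr.OVRUtil.ovrPosef_FlipHandedness;

public class FlipHandednessExample {
    public static void main(String[] args) {
        // Placeholder right-handed pose; normally from tracking state.
        OVRPosef pose = OVRPosef.calloc();
        pose.Orientation().w(1.0f); // identity orientation

        // inPose and outPose may be the same struct, flipping in place.
        ovrPosef_FlipHandedness(pose, pose);

        pose.free();
    }
}
```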
      • ovr_GetEyePoses

        public static void ovr_GetEyePoses(long session,
                                           long frameIndex,
                                           boolean latencyMarker,
                                           OVRVector3f.Buffer hmdToEyeOffset,
                                           OVRPosef.Buffer outEyePoses,
                                           double[] outSensorSampleTime)
        Array version of: _GetEyePoses