
Oculus Quest 2 development on Unity (2) - Playing baseball in VR


After installing the Oculus plug-in, all of its resources and scripts can be found under the Assets/Oculus/ folder. If you have the patience, you can go through them one by one alongside the official documentation. More usefully, there are several official demo scenes under Assets/Oculus/SampleFramework/Usage that implement and demonstrate the basic features. You can build a scene directly and run it on the headset to see the effect.
There are nine demo scenes:
1. AppDeeplink: demonstrates linking from inside the app to other apps.
2. CustomControllers: demonstrates the VR camera together with the Touch controller models.
3. CustomHands: demonstrates the VR camera together with virtual hands.
4. DebugUI: demonstrates a debug UI drawn inside the scene, which is handy because testing on Quest 2 generally requires building and deploying to the device.
5. DistanceGrab: demonstrates grabbing objects from a distance.
6. HandsInteractionTrainScene: demonstrates hand tracking. Note that when building, set Hand Tracking Support to Hands Only or Controllers And Hands; with the default Controllers Only setting it will not work. To me this is the most interesting of the demo scenes.
7. Locomotion: demonstrates the basic locomotion methods.
8. MixedRealityCapture: a mixed reality capture demo. I haven't tried it; it appears to be for compositing a person filmed by an external camera into the game scene.
9. OVROverlay: demonstrates the UI overlay effect. Turning the overlay on keeps the UI rendered on top of everything, which can make the UI feel out of place because its apparent depth no longer matches the scene.

This time we use the CustomHands scene to implement the two basic interactions in VR, looking and grabbing, and end up with a simple baseball-style demo.
First copy the CustomHands scene, rename the copy, and double-click to open it.
The scene contains only a floor and a prefab instance of OVRCameraRig.

1. Realizing the looking function

OVRCameraRig is the Oculus-optimized VR camera rig, the most basic object the Oculus plug-in uses for VR rendering; it replaces Unity's native Camera. Inside its hierarchy there are six anchors that track the left eye, the right eye, the center between the eyes, the tracker, and the left and right controllers. Simply placing one in the scene gives you the basic VR ability to look around.
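For reference, here is a minimal sketch of reading those anchor transforms from another script. It assumes the standard OVRCameraRig component from the Oculus Integration and its public anchor properties (centerEyeAnchor, leftHandAnchor, rightHandAnchor); the RigPoseReader class itself is just an illustration.

using UnityEngine;

// Illustrative sketch: reads head and hand poses from an OVRCameraRig in the scene.
public class RigPoseReader : MonoBehaviour
{
    public OVRCameraRig rig; // drag the OVRCameraRig instance here in the Inspector

    void Update()
    {
        if (rig == null) return;

        Transform head = rig.centerEyeAnchor;      // viewpoint between the two eyes
        Transform rightHand = rig.rightHandAnchor; // right Touch controller / hand

        Debug.Log($"Head at {head.position}, right hand at {rightHand.position}");
    }
}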

Now look at the scripts attached to OVRCameraRig:
OVRCameraRig.cs // controls 3D rendering and head-mounted display positioning
OVRManager.cs // the main interface to the VR headset
OVRHeadsetEmulator.cs // emulates headset rotation while developing inside the Unity editor
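As a small illustration of OVRManager being the interface to the headset, a hypothetical script could poll a couple of its static properties. I believe isHmdPresent and hasInputFocus exist in the Oculus Integration, but treat the exact names as assumptions and verify against your plug-in version.

using UnityEngine;

// Hypothetical sketch: logs basic headset state via OVRManager's static properties.
public class HeadsetStatusLogger : MonoBehaviour
{
    void Update()
    {
        if (!OVRManager.isHmdPresent)
        {
            Debug.LogWarning("No headset detected");
        }
        else if (!OVRManager.hasInputFocus)
        {
            Debug.Log("Running without input focus (e.g. the system menu is open)");
        }
    }
}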

With just this, the looking function is in place.

2. Realizing touch and grab

Under the LeftHandAnchor/RightHandAnchor of OVRCameraRig there are two additional prefab instances, CustomHandLeft/CustomHandRight, which provide the simulated hands. Let's look at the structure of one of these objects.
The main control scripts sit on the root object: OVRGrabber.cs implements grabbing, and Hand.cs drives the animation of the hand model.
GripTrans controls the position of the held object.
GrabVolume is a collider volume used to detect grabbing.
The last child is the hand model itself.

1. Touch:
The hand prefab already has a Rigidbody and colliders on its child objects, so it automatically collides with any object that also has a collider.
2. Grabbing:
The plug-in's OVRGrabber script is already attached to this prefab.
The grabber script alone is not enough, however: the OVRGrabbable script must be attached to the object being grabbed, and that object must also have a collider and a Rigidbody.
With that, both looking and grabbing are in place; a sketch of setting up a grabbable object from code follows.
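Purely as an illustration (the article sets these components up in the editor instead), a minimal sketch of configuring a grabbable object from code might look like the following. It assumes the Oculus Integration's OVRGrabbable component; the MakeGrabbable class itself is hypothetical.

using UnityEngine;

// Hypothetical sketch: makes the object it is attached to grabbable by the CustomHands.
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // A grabbable object needs a collider and a Rigidbody...
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<SphereCollider>(); // sphere chosen just as an example shape
        if (GetComponent<Rigidbody>() == null)
            gameObject.AddComponent<Rigidbody>();

        // ...plus the OVRGrabbable script so the hands' OVRGrabber can pick it up.
        if (GetComponent<OVRGrabbable>() == null)
            gameObject.AddComponent<OVRGrabbable>();
    }

    // Because the hand prefabs carry colliders, plain touches can also be detected.
    void OnCollisionEnter(Collision collision)
    {
        Debug.Log($"Touched by {collision.gameObject.name}");
    }
}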

3. Make a baseball demo

1. Make the ball
Drag a Sphere into the scene and attach a Rigidbody and a SphereCollider to it.
To get a bouncy collision, create a physic material, set its Bounciness to 1, and assign it to the collider (a scripted version of this setup is sketched below).
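As a sketch only (the article does this in the editor; the BouncyBall class below is hypothetical, though PhysicMaterial is the standard Unity type), the same bouncy setup could be done from code:

using UnityEngine;

// Hypothetical sketch: gives the ball a fully bouncy physic material at runtime.
public class BouncyBall : MonoBehaviour
{
    void Awake()
    {
        var sphere = GetComponent<SphereCollider>();

        var bouncy = new PhysicMaterial("Bouncy");
        bouncy.bounciness = 1f;                               // full rebound
        bouncy.bounceCombine = PhysicMaterialCombine.Maximum; // don't let the other surface damp the bounce

        sphere.material = bouncy;
    }
}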
2. Make the pitching machine
Drag a Cube into the scene, then add an empty child object under it as the launch point and position it where the ball should appear. Write a script that fires the ball every few seconds:

using UnityEngine;

public class PitchingMachine : MonoBehaviour
{
    [Tooltip("Reference to the ball's Rigidbody")]
    public Rigidbody Ball;
    [Tooltip("Launch point")]
    public Transform Point;
    [Tooltip("Seconds between launches")]
    public float delayTime;
    [Tooltip("Launch force")]
    public float force;

    private float timer;

    private void Awake()
    {
        // Start with a full timer so the first ball fires immediately.
        timer = delayTime;
    }

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= delayTime)
        {
            // Reposition the ball at the launch point and fire it again.
            Ball.gameObject.SetActive(true);
            Ball.transform.position = Point.position;
            Ball.velocity = Vector3.zero;         // clear any leftover motion so every pitch is identical
            Ball.angularVelocity = Vector3.zero;
            Ball.AddForce(Vector3.back * force, ForceMode.Impulse);
            timer = 0;
        }
    }
}
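To wire this up, attach PitchingMachine to the Cube, drag the Sphere's Rigidbody onto Ball and the empty launch-point object onto Point, and choose whatever delayTime and force feel right when you test (the article does not give specific values). Note that the ball is launched along the world -Z axis (Vector3.back), so orient the scene so that direction points toward the player.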

3. Make the bat and adjust the grab settings
Drag a Cube into the scene and resize it into a bat shape, then give it a Rigidbody (a Cube primitive already comes with a BoxCollider), since grabbable objects need both.
Attach the OVRGrabbable script to it and turn off Allow Offhand Grab on it.
Then enable Parent Held Object on the OVRGrabber of both hand models.
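If you want to react to the bat being picked up (for example to play a sound), a small hypothetical script could poll the grab state. I believe OVRGrabbable exposes an isGrabbed property, but treat that name as an assumption and check your Oculus Integration version.

using UnityEngine;

// Hypothetical sketch: logs when the bat is grabbed or released.
[RequireComponent(typeof(OVRGrabbable))]
public class BatGrabWatcher : MonoBehaviour
{
    private OVRGrabbable grabbable;
    private bool wasGrabbed;

    void Awake()
    {
        grabbable = GetComponent<OVRGrabbable>();
    }

    void Update()
    {
        bool grabbed = grabbable.isGrabbed; // assumed property; verify in your plug-in version
        if (grabbed != wasGrabbed)
        {
            Debug.Log(grabbed ? "Bat grabbed" : "Bat released");
            wasGrabbed = grabbed;
        }
    }
}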

4. Build and run

Source: https://blog.csdn.net/farcor_cn/article/details/117388714

