Empty VR project: just copy it and rename the project
Unity 2021.3.8f1c1
XR Interaction Toolkit 2.1.1
PICO Unity Integration SDK v211 : under the Packages/PICO UnityXR Integration SDK v211 directory
Oculus : not installed
HTC Vive : not installed
Oculus recommends switching texture compression to ASTC
Color Space : Linear
Auto Graphics API : off
Active Input Handling (input manager) : Both (needed for the Sogou input method)
Under \Assets\TextMesh Pro\Fonts : msyh.ttf (Microsoft YaHei), zh-cn.txt (Chinese character set)
Window→TextMeshPro→Font Asset Creator : select the txt file as the character set, atlas size 8192, render mode SDFAA, click Generate Font Atlas, then Save
Sogou VR keyboard
https://t7bkgg7632.feishu.cn/drive/home/
UnityXR_SogouKeyboard.unitypackage
Unity XR SougouKeyboard Package integration guide
The Sogou VR keyboard is only for applications released in mainland China.
Currently only InputField (Legacy) is supported; InputField (TMP) is not supported.
VR camera + two ray controller objects
Interaction manager class; added automatically when **XR Origin (VR)** is added
Added automatically under XR Origin (VR)/Camera Offset/Main Camera when XR Origin (VR) is added
Added automatically under XR Origin (VR)/Camera Offset/LeftHand Controller and RightHand Controller when XR Origin (VR) is added
Interprets feature values on a tracked input controller device from the XR input subsystem as XR interaction states, such as Select, and applies the tracked device's current pose to the transform. (Multiple XRControllers are supported: the left hand can carry a Ray and a Direct interactor at the same time, which enables more complex interaction, but be careful with the hierarchy/priority between interactors.)
Interactor that holds interactable content in place via a socket
Controller template object with a ray (left and right hands are added automatically when XR Origin (VR) is added)
Interactor used for directly interacting with interactables that are touching it. This is handled via trigger volumes that update this interactor's current set of valid targets. The component needs a collider set as a trigger in order to work.
Controller template object without a ray; a collider must be added so it can pick things up
The simplest interactable: it only provides a concrete interactable implementation and is generally used to respond to interaction events, with no actual interaction behavior of its own.
Interactable component that enables basic "grab" functionality. It can attach to a selecting interactor and follow it while obeying physics (and inherits the interactor's velocity when released).
Once this component is added the object becomes grabbable by the controller (both the ray and non-ray interactors above). It must have a Collider for detection; a Rigidbody is recommended for grabbing and throwing (see the sketch below).
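A minimal sketch of making an object grabbable from code; the class names come from XR Interaction Toolkit 2.x, but this helper itself is hypothetical:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical helper: turns the object it sits on into a grabbable for ray/direct interactors.
[RequireComponent(typeof(Collider))]   // a Collider is required so interactors can detect the object
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // XRGrabInteractable requires a Rigidbody and adds one automatically if it is missing.
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        grab.throwOnDetach = true; // inherit the interactor's velocity when released
    }
}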
Create a Canvas; only UI components under this Canvas can interact properly with the controller rays (default ray length 30 m). See the sketch after this list.
Canvas Scaler
Unit
Tracked Device Graphic Raycaster
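A minimal sketch of configuring a Canvas so the controller rays can hit it; TrackedDeviceGraphicRaycaster is from XR Interaction Toolkit 2.x, and the helper itself is an assumption:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Hypothetical setup helper: put it on the Canvas GameObject.
[RequireComponent(typeof(Canvas))]
public class XRCanvasSetup : MonoBehaviour
{
    void Awake()
    {
        // XR controller rays only interact with world-space UI.
        GetComponent<Canvas>().renderMode = RenderMode.WorldSpace;

        // TrackedDeviceGraphicRaycaster handles XR ray input instead of the default GraphicRaycaster.
        if (GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();
    }
}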
Walk/run animation automatically adapts to the terrain. All directions, all terrain.
Controls access to the XR Origin. This system enforces that only one locomotion provider can move the XR Origin at a time. It is the single place where access to the XR Origin is gated; driving a single XR Origin with multiple LocomotionSystem instances is not recommended.
Ground-based teleportation (Anchor = fixed teleport point). The ray changes color once it hits a teleportable surface; by default, pressing the Trigger performs the teleport.
Teleportation manager class
Turn manager class
Continuous Turn Provider
Input System UI Actions
Point Action
Left Click Action
Middle Click Action
Right Click Action
Scroll Wheel Action
Navigate Action
Submit Action
Cancel Action
Character Controller : third-person character class
Character Controller Driver : drives the Character Controller above during locomotion, but the Character Controller apparently does not follow when the player only turns. It needs a tweak: inherit from it and call UpdateCharacterController() in Update() so the Character Controller is updated every frame.
using UnityEngine.XR.Interaction.Toolkit;

// Subclass of CharacterControllerDriver that syncs the CharacterController every frame,
// so it also follows the HMD when the player merely turns in place.
public class CharacterControllerDriverUpdate : CharacterControllerDriver
{
    void Update()
    {
        // UpdateCharacterController() is protected in the base class, so it can be called here.
        UpdateCharacterController();
    }
}
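Use this subclass in place of the stock CharacterControllerDriver component on the rig so the CharacterController keeps tracking the headset every frame.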
https://blog.csdn.net/m0_60290598/article/details/123521029
Can be downloaded from the official site; install Oculus Integration 34 via the package manager and import it
OVRPlayerController : drag it into the scene to get movement. To set up the controller models, under OVRPlayerController→OVRCameraRig→TrackingSpace drag CustomHandLeft under LeftControllerAnchor and CustomHandRight under RightControllerAnchor (the hands are in Core/CustomHands/Models). If the fist will not close, change the Blending of the ThumbLayer in the animation Layers to Additive; ultimately this has to be changed in code.
OVRControllerPrefab : this is the ray
OVR Grabber : grabbing; included on CustomHandLeft / Right by default
OVR Grabbable : add it to an object to make the object grabbable; a Rigidbody is also required (see the sketch below)
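A minimal sketch, assuming Oculus Integration is imported; OVRGrabbable uses the object's own collider as its grab point when none are assigned, and the helper itself is hypothetical:

using UnityEngine;

// Hypothetical helper: makes this object grabbable by the OVR Grabber hands.
[RequireComponent(typeof(Collider), typeof(Rigidbody))]
public class MakeOVRGrabbable : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<OVRGrabbable>() == null)
            gameObject.AddComponent<OVRGrabbable>();
    }
}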
OVR Raycaster : controller-to-UI interaction; set the Canvas to World Space
UIHelpers : Oculus→SampleFramework→Core→DebugUI→Prefabs→UIHelpers, drag it in; there will then be two EventSystems, delete the one outside the prefab
Locomotion Controller
Locomotion Teleport : the cursor shown while moving (probably)
Teleport Input Handler Touch
Teleport Target Handler Physical
Teleport Aim Handler Laser : line, laser
Teleport Aim Handler Visual Parabolic : line, parabola
Teleport Aim Handler Visual Laser : line
Teleport Aim Visual Laser : renders the line object
Teleport transition types : Teleport Transition Blink / Teleport Transition Instant / Teleport Transition Warp
Teleport Orientation Handler Thumbstick : rotation
Pico Metrics Tool 1.2.0 : performance monitoring; once enabled it monitors in real time
CurvedUI (curved UI) : Unity plugin that can stretch flat UI into a curved surface
LuBan data : install the .NET SDK 6.0 before running
# Error "Allow unsafe code in Player Settings to fix this error"
# Edit → Project Settings → Player → search for 'unsafe' and tick it
# Installation
# luban_examples\Projects\Csharp_Unity_bin_or_json\Assets\LubanLib → \Assets\Plugins\Luban\LubanLib
# Install the tool library
# luban_examples\Tools\Luban.ClientServer → \Plugins\Luban\Tools
# luban_examples\MiniTemplate → \Plugins\Luban
# Copy the bat and sh files from luban_examples\Projects\Csharp_Unity_bin_or_json to \Plugins\Luban
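After the directories are copied in, the generated tables are usually loaded through the cfg.Tables entry class emitted by the generator. A minimal sketch assuming bin-mode output and the Bright.Serialization.ByteBuf type bundled with the Luban.ClientServer runtime; class names, namespaces, and the data path all depend on your Luban version and generation settings:

using System.IO;
using UnityEngine;
using Bright.Serialization; // assumption: runtime shipped with Luban.ClientServer; newer Luban versions use a different namespace

public class LubanTablesLoader : MonoBehaviour
{
    void Start()
    {
        // "GenerateDatas/bytes" is the output folder used by luban_examples; adjust to your own gen script.
        var tables = new cfg.Tables(file =>
            new ByteBuf(File.ReadAllBytes($"{Application.dataPath}/GenerateDatas/bytes/{file}.bytes")));
        Debug.Log("Luban tables loaded: " + tables);
    }
}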
HybridCLR hot update :
# Installation (the crude way)
Clone https://gitee.com/focus-creative-games/hybridclr_unity.git from Gitee locally, then copy it into the Packages directory
# Menu HybridCLR/Installer...
# Unity 2021 iOS builds require overwriting a Unity file
Copy the file {package directory}/Data~/ModifiedUnityAssemblies/2021.3.x/UnityEditor.CoreModule-{Win,Mac}.dll
over {Editor install directory}/Editor/Data/Managed/UnityEngine/UnityEditor.CoreModule.dll
How the dll is made : https://focus-creative-games.github.io/hybridclr/modify_unity_dll/
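At runtime the hot-update assembly is loaded with the standard Assembly.Load(byte[]) call and HybridCLR interprets the code inside it. A minimal sketch; the file name, path, and entry type below are assumptions for illustration:

using System.IO;
using System.Reflection;
using UnityEngine;

public class HotUpdateLoader : MonoBehaviour
{
    void Start()
    {
        // Assumed location; on Android, StreamingAssets must be read with UnityWebRequest instead of File IO.
        string dllPath = Path.Combine(Application.streamingAssetsPath, "HotUpdate.dll.bytes");
        Assembly hotUpdate = Assembly.Load(File.ReadAllBytes(dllPath));

        // Assumed entry point inside the hot-update assembly.
        hotUpdate.GetType("HotUpdateEntry")?.GetMethod("Run")?.Invoke(null, null);
    }
}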
Asset Bundle Browser
URP render pipeline
Pico Metrics Tool (translation)
UnityEngine.XR.CommonUsages
// Informs the developer whether the device is currently being tracked.
public static InputFeatureUsage<bool> isTracked = new InputFeatureUsage<bool>("IsTracked");
// The primary face button being pressed on a device, or sole button if only one is available.
public static InputFeatureUsage<bool> primaryButton = new InputFeatureUsage<bool>("PrimaryButton");
// The primary face button being touched on a device.
public static InputFeatureUsage<bool> primaryTouch = new InputFeatureUsage<bool>("PrimaryTouch");
// The secondary face button being pressed on a device.
public static InputFeatureUsage<bool> secondaryButton = new InputFeatureUsage<bool>("SecondaryButton");
// The secondary face button being touched on a device.
public static InputFeatureUsage<bool> secondaryTouch = new InputFeatureUsage<bool>("SecondaryTouch");
// The Grip button being pressed on the device.
public static InputFeatureUsage<bool> gripButton = new InputFeatureUsage<bool>("GripButton");
// The Trigger button being pressed.
public static InputFeatureUsage<bool> triggerButton = new InputFeatureUsage<bool>("TriggerButton");
// The Menu button being pressed; used to pause, go back, or exit the game.
public static InputFeatureUsage<bool> menuButton = new InputFeatureUsage<bool>("MenuButton");
// Represents the primary 2D axis being clicked or otherwise depressed.
public static InputFeatureUsage<bool> primary2DAxisClick = new InputFeatureUsage<bool>("Primary2DAxisClick");
// Represents the primary 2D axis being touched.
public static InputFeatureUsage<bool> primary2DAxisTouch = new InputFeatureUsage<bool>("Primary2DAxisTouch");
// Represents the secondary 2D axis being clicked or otherwise depressed.
public static InputFeatureUsage<bool> secondary2DAxisClick = new InputFeatureUsage<bool>("Secondary2DAxisClick");
// Represents the secondary 2D axis being touched.
public static InputFeatureUsage<bool> secondary2DAxisTouch = new InputFeatureUsage<bool>("Secondary2DAxisTouch");
// Use this property to test whether the user is currently wearing and/or interacting
// with the XR device. The exact behavior of this property varies with each type
// of device: some devices have a sensor specifically to detect user proximity,
// however you can reasonably infer that a user is present with the device when
// the property is UserPresenceState.Present.
public static InputFeatureUsage<bool> userPresence = new InputFeatureUsage<bool>("UserPresence");
// Represents the values being tracked for this device.
public static InputFeatureUsage<InputTrackingState> trackingState = new InputFeatureUsage<InputTrackingState>("TrackingState");
// Value representing the current battery life of this device.
public static InputFeatureUsage<float> batteryLevel = new InputFeatureUsage<float>("BatteryLevel");
// A trigger-like control, pressed with the index finger.
public static InputFeatureUsage<float> trigger = new InputFeatureUsage<float>("Trigger");
// Represents the user's grip on the controller.
public static InputFeatureUsage<float> grip = new InputFeatureUsage<float>("Grip");
// The primary touchpad or joystick on a device.
public static InputFeatureUsage<Vector2> primary2DAxis = new InputFeatureUsage<Vector2>("Primary2DAxis");
// A secondary touchpad or joystick on a device.
public static InputFeatureUsage<Vector2> secondary2DAxis = new InputFeatureUsage<Vector2>("Secondary2DAxis");
// The position of the device.
public static InputFeatureUsage<Vector3> devicePosition = new InputFeatureUsage<Vector3>("DevicePosition");
// The position of the left eye on this device.
public static InputFeatureUsage<Vector3> leftEyePosition = new InputFeatureUsage<Vector3>("LeftEyePosition");
// The position of the right eye on this device.
public static InputFeatureUsage<Vector3> rightEyePosition = new InputFeatureUsage<Vector3>("RightEyePosition");
// The position of the center eye on this device.
public static InputFeatureUsage<Vector3> centerEyePosition = new InputFeatureUsage<Vector3>("CenterEyePosition");
// The position of the color camera on this device.
public static InputFeatureUsage<Vector3> colorCameraPosition = new InputFeatureUsage<Vector3>("CameraPosition");
// The velocity of the device.
public static InputFeatureUsage<Vector3> deviceVelocity = new InputFeatureUsage<Vector3>("DeviceVelocity");
// The angular velocity of this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> deviceAngularVelocity = new InputFeatureUsage<Vector3>("DeviceAngularVelocity");
// The velocity of the left eye on this device.
public static InputFeatureUsage<Vector3> leftEyeVelocity = new InputFeatureUsage<Vector3>("LeftEyeVelocity");
// The angular velocity of the left eye on this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> leftEyeAngularVelocity = new InputFeatureUsage<Vector3>("LeftEyeAngularVelocity");
// The velocity of the right eye on this device.
public static InputFeatureUsage<Vector3> rightEyeVelocity = new InputFeatureUsage<Vector3>("RightEyeVelocity");
// The angular velocity of the right eye on this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> rightEyeAngularVelocity = new InputFeatureUsage<Vector3>("RightEyeAngularVelocity");
// The velocity of the center eye on this device.
public static InputFeatureUsage<Vector3> centerEyeVelocity = new InputFeatureUsage<Vector3>("CenterEyeVelocity");
// The angular velocity of the center eye on this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> centerEyeAngularVelocity = new InputFeatureUsage<Vector3>("CenterEyeAngularVelocity");
// The velocity of the color camera on this device.
public static InputFeatureUsage<Vector3> colorCameraVelocity = new InputFeatureUsage<Vector3>("CameraVelocity");
// The angular velocity of the color camera on this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> colorCameraAngularVelocity = new InputFeatureUsage<Vector3>("CameraAngularVelocity");
// The acceleration of the device.
public static InputFeatureUsage<Vector3> deviceAcceleration = new InputFeatureUsage<Vector3>("DeviceAcceleration");
// The angular acceleration of this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> deviceAngularAcceleration = new InputFeatureUsage<Vector3>("DeviceAngularAcceleration");
// The acceleration of the left eye on this device.
public static InputFeatureUsage<Vector3> leftEyeAcceleration = new InputFeatureUsage<Vector3>("LeftEyeAcceleration");
// The angular acceleration of the left eye on this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> leftEyeAngularAcceleration = new InputFeatureUsage<Vector3>("LeftEyeAngularAcceleration");
// The acceleration of the right eye on this device.
public static InputFeatureUsage<Vector3> rightEyeAcceleration = new InputFeatureUsage<Vector3>("RightEyeAcceleration");
// The angular acceleration of the right eye on this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> rightEyeAngularAcceleration = new InputFeatureUsage<Vector3>("RightEyeAngularAcceleration");
// The acceleration of the center eye on this device.
public static InputFeatureUsage<Vector3> centerEyeAcceleration = new InputFeatureUsage<Vector3>("CenterEyeAcceleration");
// The angular acceleration of the center eye on this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> centerEyeAngularAcceleration = new InputFeatureUsage<Vector3>("CenterEyeAngularAcceleration");
// The acceleration of the color camera on this device.
public static InputFeatureUsage<Vector3> colorCameraAcceleration = new InputFeatureUsage<Vector3>("CameraAcceleration");
// The angular acceleration of the color camera on this device, formatted as euler angles.
public static InputFeatureUsage<Vector3> colorCameraAngularAcceleration = new InputFeatureUsage<Vector3>("CameraAngularAcceleration");
// The rotation of the device.
public static InputFeatureUsage<Quaternion> deviceRotation = new InputFeatureUsage<Quaternion>("DeviceRotation");
// The rotation of the left eye on this device.
public static InputFeatureUsage<Quaternion> leftEyeRotation = new InputFeatureUsage<Quaternion>("LeftEyeRotation");
// The rotation of the right eye on this device.
public static InputFeatureUsage<Quaternion> rightEyeRotation = new InputFeatureUsage<Quaternion>("RightEyeRotation");
// The rotation of the center eye on this device.
public static InputFeatureUsage<Quaternion> centerEyeRotation = new InputFeatureUsage<Quaternion>("CenterEyeRotation");
// The rotation of the color camera on this device.
public static InputFeatureUsage<Quaternion> colorCameraRotation = new InputFeatureUsage<Quaternion>("CameraRotation");
// Value representing the hand data for this device.
public static InputFeatureUsage<Hand> handData = new InputFeatureUsage<Hand>("HandData");
// An Eyes struct containing eye tracking data collected from the device.
public static InputFeatureUsage<Eyes> eyesData = new InputFeatureUsage<Eyes>("EyesData");
// A non-handed 2D axis.
[Obsolete("CommonUsages.dPad is not used by any XR platform and will be removed.")]
public static InputFeatureUsage<Vector2> dPad = new InputFeatureUsage<Vector2>("DPad");
// Represents the grip pressure or angle of the index finger.
[Obsolete("CommonUsages.indexFinger is not used by any XR platform and will be removed.")]
public static InputFeatureUsage<float> indexFinger = new InputFeatureUsage<float>("IndexFinger");
// Represents the grip pressure or angle of the middle finger.
[Obsolete("CommonUsages.MiddleFinger is not used by any XR platform and will be removed.")]
public static InputFeatureUsage<float> middleFinger = new InputFeatureUsage<float>("MiddleFinger");
// Represents the grip pressure or angle of the ring finger.
[Obsolete("CommonUsages.RingFinger is not used by any XR platform and will be removed.")]
public static InputFeatureUsage<float> ringFinger = new InputFeatureUsage<float>("RingFinger");
// Represents the grip pressure or angle of the pinky finger.
[Obsolete("CommonUsages.PinkyFinger is not used by any XR platform and will be removed.")]
public static InputFeatureUsage<float> pinkyFinger = new InputFeatureUsage<float>("PinkyFinger");
// Represents a thumbrest or light thumb touch.
[Obsolete("CommonUsages.thumbrest is Oculus only, and is being moved to their package. Please use OculusUsages.thumbrest. These will still function until removed.")]
public static InputFeatureUsage<bool> thumbrest = new InputFeatureUsage<bool>("Thumbrest");
// Represents a touch of the trigger or index finger.
[Obsolete("CommonUsages.indexTouch is Oculus only, and is being moved to their package. Please use OculusUsages.indexTouch. These will still function until removed.")]
public static InputFeatureUsage<float> indexTouch = new InputFeatureUsage<float>("IndexTouch");
// Represents the thumb pressing any input or feature.
[Obsolete("CommonUsages.thumbTouch is Oculus only, and is being moved to their package. Please use OculusUsages.thumbTouch. These will still function until removed.")]
public static InputFeatureUsage<float> thumbTouch = new InputFeatureUsage<float>("ThumbTouch");
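These usages are read at runtime through the InputDevices API. A minimal sketch; the MonoBehaviour itself is only an illustration:

using UnityEngine;
using UnityEngine.XR;

// Reads a few CommonUsages values from the right-hand controller every frame.
public class ControllerInputExample : MonoBehaviour
{
    void Update()
    {
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!rightHand.isValid)
            return;

        if (rightHand.TryGetFeatureValue(CommonUsages.triggerButton, out bool triggerPressed) && triggerPressed)
            Debug.Log("Trigger pressed");

        if (rightHand.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick) && stick != Vector2.zero)
            Debug.Log($"Primary 2D axis: {stick}");
    }
}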