Note: a companion commentary article is available (see the link in the original post); reading it alongside this sample works best.
This sample demonstrates how to use SceneKit content together with ARKit to achieve an interactive AR experience.
It uses ARKit to display an animated chameleon in a camera view. You can interact with the chameleon by touching it, and you can move it to a new location by tapping a spot or by dragging the chameleon. The model also reacts to the user's movement, based on the relative position and proximity of the camera.
This sample demonstrates several ARKit and SceneKit concepts; a number of helper methods are implemented in an ARSCNView extension in the file Extensions.swift.
The interactive chameleon in this sample reacts to various events, based on the rendering loop and on interactions by the user. These actions are triggered in the following methods in Chameleon.swift:
- reactToInitialPlacement(in:): Called when a plane has been detected and the chameleon is initially placed in the scene.
- reactToPositionChange(in:): Called when the user moves the chameleon to a new location.
- reactToTap(in:): Called when the user touches the chameleon.
- reactToRendering(in:): Called every frame at the beginning of a new rendering cycle. Used to control head and body turn animations based on the camera pose.
- reactToDidApplyConstraints(in:): Called every frame after all constraints have been applied. Used to update the position of the tongue.

Plane detection is used to identify a horizontal surface on which the chameleon can be placed.
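The two per-frame methods above map onto SceneKit's render-loop delegate callbacks. A minimal sketch of how such wiring could look (the `Chameleon` type and its interface are assumed from the method names above; the project's actual setup may differ):

```swift
import ARKit
import SceneKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let chameleon = Chameleon()   // the sample's model class (assumed interface)

    // Called at the beginning of each rendering cycle.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        chameleon.reactToRendering(in: sceneView)
    }

    // Called once all constraints (e.g. the head look-at) have been applied.
    func renderer(_ renderer: SCNSceneRenderer, didApplyConstraintsAtTime time: TimeInterval) {
        chameleon.reactToDidApplyConstraints(in: sceneView)
    }
}
```

ARSCNViewDelegate inherits from SCNSceneRendererDelegate, so the AR view's delegate receives both the anchor callbacks and these per-frame render-loop callbacks.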
Once a plane has been found, the chameleon's transform is set to the plane anchor's transform in the renderer(_:didAdd:for:) method.
For reasons of simplicity in this sample, the model is invisibly loaded into the scene from the beginning, and the node's isHidden property is set to false to display it. In a more complex scene, you could load the content asynchronously when needed.
The chameleon's eyes focus on the camera via an SCNLookAtConstraint. For a more natural, saccadic eye movement, a random offset is additionally applied to each eye's pose.
In each frame, the chameleon's position relative to the camera is computed (see reactToRendering(in:)) to determine whether the user is within the chameleon's field of view. If so, the chameleon moves its head (with some delay) to look at the user (see handleWithinFieldOfView(localTarget:distance:)). In this method, it is also checked whether …
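The field-of-view test could be sketched as an angle check in the chameleon's local space; the half-angle threshold and the assumption that -z is the model's forward axis are illustrative, not taken from the sample:

```swift
import SceneKit
import simd

// Is the camera within the chameleon's field of view? Measures the angle between
// the node's forward axis (-z assumed) and the direction to the camera.
func isCameraWithinFieldOfView(of node: SCNNode, pointOfView: SCNNode,
                               halfAngle: Float = .pi / 3) -> Bool {
    // Camera position expressed in the chameleon's local coordinates.
    let localCamera = node.simdConvertPosition(pointOfView.simdWorldPosition, from: nil)
    let distance = simd_length(localCamera)
    guard distance > 0 else { return false }
    let forward = simd_float3(0, 0, -1)
    let cosAngle = simd_dot(localCamera / distance, forward)
    return cosAngle > cos(halfAngle)
}
```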
The head movement is realized with an SCNLookAtConstraint.
If the camera pose is such that the chameleon cannot turn its head far enough to face the user, a turn animation is triggered to obtain a better position (see playTurnAnimation(_:)).
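A body turn in the spirit of playTurnAnimation(_:) could be sketched with an SCNAction; the duration and easing are illustrative assumptions:

```swift
import SceneKit

// Rotate the whole body around the vertical axis toward a better orientation.
func playTurn(on node: SCNNode, by angle: CGFloat) {
    let turn = SCNAction.rotateBy(x: 0, y: angle, z: 0, duration: 1.0)
    turn.timingMode = .easeInEaseOut
    node.runAction(turn, forKey: "turn")
}
```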
When the chameleon shoots its tongue, it must be ensured that the tongue moves toward the user and sticks to the screen even when the camera moves. For that reason, the tongue's position must be updated every frame. This is done in reactToDidApplyConstraints(in:) to ensure that it happens after other animations, like the head rotation, have already been applied.
Chameleons can change color to adapt to the environment. This is done upon initial placement and whenever the chameleon is moved (see activateCamouflage(_:) and updateCamouflage(_:)).
The camouflage color is determined by computing the average color of a patch taken from the center of the current camera image (see averageColorFromEnvironment(at:) in Extensions.swift).
The camouflage is then applied by modifying two variables in a Metal shader:

- blendFactor blends between an opaque, colorful texture and a semitransparent texture that can be combined with a uniform color.
- skinColorFromEnvironment sets the base color that shines through the transparent parts of the texture, creating a skin tone that is dominated by this color.

Xcode 9 and iOS 11 SDK
iOS 11 or later
ARKit requires an iOS device with an A9 or later processor.
Copyright (C) 2017 Apple Inc. All rights reserved.