English|中文
This sample is provided as a reference for learning the Ascend AI Software Stack and cannot be used for commercial purposes.
This README provides guidance only for running the sample in command-line (CLI) mode. For details about how to run the sample in MindStudio, see Running Image Samples in MindStudio.
Function: uses the 3DCNN model to perform classification inference on the input data.
Input: a data (.bin) file converted from a video file by the video2bin script.
Output: the confidence of 10 action classes.
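For reference, the following is a minimal sketch of what such a video-to-bin conversion typically involves. It is not the actual video2bin script shipped with the sample; the function name, file names, and preprocessing choices are illustrative assumptions based on the model input shape (1, 16, 112, 112, 3, float32) used later in the model conversion step.

```python
# Illustrative sketch only: the real video2bin script in the sample may preprocess differently.
import cv2
import numpy as np

def video_to_bin(video_path, bin_path, num_frames=16, size=112):
    """Sample num_frames frames from a video, resize them, and dump an NDHWC float32 .bin file."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    while len(frames) < num_frames:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.resize(frame, (size, size)))
    cap.release()
    if not frames:
        raise ValueError("no frames could be read from " + video_path)
    while len(frames) < num_frames:              # pad short clips by repeating the last frame
        frames.append(frames[-1])
    clip = np.stack(frames).astype(np.float32)   # (16, 112, 112, 3)
    clip = np.expand_dims(clip, axis=0)          # (1, 16, 112, 112, 3), NDHWC
    clip.tofile(bin_path)

# Hypothetical usage: convert a local video into a .bin file for inference.
video_to_bin("gesture.mp4", "gesture.bin")
```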
Check whether the following requirements are met. If not, perform the operations described in the Remarks column. If the CANN version has been upgraded, check whether the third-party dependencies need to be reinstalled. (The third-party dependencies for CANN 5.0.4 and later versions are different from those for earlier versions.)
Item | Requirement | Remarks |
---|---|---|
CANN version | ≥ 5.0.4 | Install the CANN by referring to Sample Deployment in the About Ascend Samples Repository. If the CANN version is earlier than the required version, switch to the samples repository specific to the CANN version. See Release Notes. |
Hardware | Atlas 200 DK/Atlas 300 (AI1s) | This sample has been tested on the Atlas 200 DK and Atlas 300 (AI1s). For details about the products, see Hardware Platform. Other products may require adaptation. |
Third-party dependency | python-acllite | Select required dependencies. See Third-Party Dependency Installation Guide (Python Sample). |
Obtain the source package.
You can download the source code in either of the following ways:
# In the development environment, run the following commands as a non-root user to download the source repository:
cd ${HOME}
git clone https://gitee.com/ascend/samples.git
# Switch to the required tag (v0.5.0 in this example) inside the cloned repository:
cd ${HOME}/samples
git checkout v0.5.0
# 1. Click Clone or Download in the upper right corner of the samples repository and click Download ZIP.
# 2. Upload the .zip package to the home directory of a common user in the development environment, for example, ${HOME}/ascend-samples-master.zip.
# 3. In the development environment, run the following commands to unzip the package:
cd ${HOME}
unzip ascend-samples-master.zip
Obtain the source network model required by the application.
Model | Description | How to Obtain |
---|---|---|
3DCNN | 3D action recognition model. It is a 3DCNN model based on TensorFlow. | Download the model and weight files by referring to the links in README.md in the ATC_3DCNN_tensorflow_AE directory of the ModelZoo repository. |
# To facilitate download, the commands for downloading the original model and converting it are provided here. You can copy and run them directly, or download the model from ModelZoo by referring to the table above and convert it manually.
cd ${HOME}/samples/python/contrib/3Dgesture_recognition/model
wget https://obs-9be7.obs.cn-east-2.myhuaweicloud.com/003_Atc_Models/AE/ATC%20Model/3D_gesture_recognition/3d_gesture_recognition.pb
atc --model=3d_gesture_recognition.pb --framework=3 --output=3d_gesture_recognition --soc_version=Ascend310 --input_shape="X:1,16,112,112,3" --input_format=NDHWC
Obtain the test data required by the sample.
Run the following commands to go to the **data** folder of the sample and download the corresponding test data:
cd $HOME/samples/python/contrib/3Dgesture_recognition/data
wget https://obs-9be7.obs.cn-east-2.myhuaweicloud.com/003_Atc_Models/AE/ATC%20Model/3D_gesture_recognition/testdata/test_float32_actiontype7.bin
cd ../src
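Optionally, you can check that the downloaded .bin file matches the model input shape (1, 16, 112, 112, 3) specified in the atc command above. The following is a minimal sketch; the float32 data type is an assumption based on the file name.

```python
import numpy as np

# Expected element count for input shape (1, 16, 112, 112, 3); float32 is assumed
# from the file name (test_float32_actiontype7.bin).
expected = 1 * 16 * 112 * 112 * 3
data = np.fromfile("../data/test_float32_actiontype7.bin", dtype=np.float32)
print("elements:", data.size, "expected:", expected, "match:", data.size == expected)
```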
Note: If the development environment and operating environment are set up on the same server, skip the upload step below and go directly to running the sample.
Run the following commands to upload the 3Dgesture_recognition directory in the development environment to any directory in the operating environment, for example, /home/HwHiAiUser, and log in to the operating environment (host) as the running user (HwHiAiUser):
# In the following information, xxx.xxx.xxx.xxx is the IP address of the operating environment. The IP address of Atlas 200 DK is 192.168.1.2 when it is connected over the USB port, and that of Atlas 300 (AI1s) is the corresponding public IP address.
scp -r $HOME/samples/python/contrib/3Dgesture_recognition HwHiAiUser@xxx.xxx.xxx.xxx:/home/HwHiAiUser
ssh HwHiAiUser@xxx.xxx.xxx.xxx
cd ${HOME}/3Dgesture_recognition/src
python3.6 3Dgesture_recognition.py ../data/
After the execution is complete, the confidences of the 10 action classes are printed on the screen.
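For reference, the printed scores can be interpreted as in the following minimal sketch, assuming the model output is a 1 x 10 float32 score array. The placeholder scores and the post-processing shown here are illustrative; the actual printing is done by the sample code itself.

```python
import numpy as np

# Placeholder for the 1 x 10 score array produced by the model (illustrative only).
scores = np.random.rand(1, 10).astype(np.float32)
# Normalize to confidences with a softmax and pick the most likely action class.
probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
top = int(np.argmax(probs, axis=1)[0])
print("predicted action index:", top, "confidence:", round(float(probs[0, top]), 4))
```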
For details about how to rectify errors, see Troubleshooting. If your error is not covered in the Wiki, submit an issue in the samples repository.