# qs_face_points_android

**Repository Path**: qiansou/qs_face_points_android

## Basic Information

- **Project Name**: qs_face_points_android
- **Description**: Face detection, 68 facial landmarks, and head pose (HeadPose) estimation
- **Primary Language**: Android
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 5
- **Forks**: 2
- **Created**: 2018-07-31
- **Last Updated**: 2023-11-22

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

![68 facial landmark indices](https://cloud.githubusercontent.com/assets/16308037/24229391/1910e9cc-0fb4-11e7-987b-0fecce2c829e.JPG)

![Demo](https://cloud.githubusercontent.com/assets/16308037/24230147/79bf1c68-0fb8-11e7-859b-8482f9b559a5.gif)

## QsFacePoints SDK File List

1. SDK package: qsface-release.aar
2. Model file: qiansou_68_face_landmarks.dat
3. Demo: facepoints

## SDK Usage Guide

#### 1. Copy the model file to the sdcard

The model file path used by the SDK is defined by `Constants.getFaceShapeModelPath()` (see the asset-copy sketch at the end of this README).

#### 2. Initialize the face engine

```
QsFaceApi api = new QsFaceApi();
// Constants.getFaceShapeModelPath() is the file path of the model file qiansou_68_face_landmarks.dat
api.initial(Constants.getFaceShapeModelPath());
```

#### 3. Run face analysis on the Android camera preview

```
/**
 * Preview callback: converts each NV21 frame to a Bitmap and runs face analysis on it.
 */
private Camera.PreviewCallback previewCallback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = mCamera.getParameters().getPreviewSize();
        try {
            // Convert the NV21 preview frame to a Bitmap via an in-memory JPEG.
            YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
            ByteArrayOutputStream stream = new ByteArrayOutputStream();
            image.compressToJpeg(new Rect(0, 0, size.width, size.height), 80, stream);
            Bitmap bmp = BitmapFactory.decodeByteArray(stream.toByteArray(), 0, stream.size());

            // Face detection, landmark localization, and pose estimation.
            // QsFace ret = api.detect("/sdcard/temp.jpg"); // alternatively, analyze a local photo
            QsFace face = api.detectBitmap(bmp);
            String label = face.getLabel();
            int rectLeft = face.getLeft();
            int rectTop = face.getTop();
            int rectRight = face.getRight();
            int rectBottom = face.getBottom();
            Log.i("qsface", "=====>" + label + "," + rectLeft + "," + rectTop + "," + rectRight + "," + rectBottom);

            // The 68 landmarks; the index layout is shown in
            // https://cloud.githubusercontent.com/assets/16308037/24229391/1910e9cc-0fb4-11e7-987b-0fecce2c829e.JPG
            ArrayList<Point> landmarks = face.getFaceLandmarks();
            Log.i("qsface", "=====>" + landmarks.size());
            for (Point p : landmarks) {
                Log.i("p=>", p.toString());
            }

            // Head pose angles (pitch, yaw, roll); background reading:
            // https://www.cnblogs.com/21207-iHome/p/6894128.html
            Log.i("qsface", "===angle==>" + face.getPitch() + "," + face.getYaw() + "," + face.getRoll());

            stream.close();
        } catch (Exception ex) {
            Log.e("Sys", "Error:" + ex.getMessage());
        }
    }
};
```

A sketch that draws the returned rectangle and landmarks onto the Bitmap is given at the end of this README.
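
#### Sketch: copying the bundled model to the sdcard

A minimal sketch of step 1, assuming the model file is bundled in the app's `assets/` folder; `copyModelFromAssets`, its `Context` parameter, and the asset location are illustrative assumptions rather than SDK API, and runtime storage permissions are not handled here.

```
import android.content.Context;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Illustrative helper (not part of the SDK): copies the model from assets to the
// path the engine expects, i.e. Constants.getFaceShapeModelPath().
private void copyModelFromAssets(Context context) throws IOException {
    File target = new File(Constants.getFaceShapeModelPath());
    if (target.exists()) {
        return; // already copied on a previous run
    }
    File parent = target.getParentFile();
    if (parent != null && !parent.exists()) {
        parent.mkdirs();
    }
    // Assumption: qiansou_68_face_landmarks.dat was placed under app/src/main/assets/.
    try (InputStream in = context.getAssets().open("qiansou_68_face_landmarks.dat");
         OutputStream out = new FileOutputStream(target)) {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }
}
```

Call this once (for example in `onCreate()`) before `api.initial(...)`, so the engine finds the model on first launch.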
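
#### Sketch: drawing the face rectangle and landmarks

For debugging, the rectangle and 68 landmarks returned by `QsFace` can be drawn onto the analyzed Bitmap. A minimal sketch, assuming the landmark coordinates are in the Bitmap's pixel space and that `Point` is `android.graphics.Point`; `drawFace` is an illustrative helper, and only the `QsFace` getters come from the SDK code above.

```
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Point;

// Illustrative helper (not part of the SDK): returns a copy of the input Bitmap with the
// detection rectangle and the 68 landmarks drawn on it.
private Bitmap drawFace(Bitmap src, QsFace face) {
    Bitmap out = src.copy(Bitmap.Config.ARGB_8888, true); // mutable copy we can draw on
    Canvas canvas = new Canvas(out);

    // Detection rectangle from the SDK getters.
    Paint rectPaint = new Paint();
    rectPaint.setColor(Color.GREEN);
    rectPaint.setStyle(Paint.Style.STROKE);
    rectPaint.setStrokeWidth(3f);
    canvas.drawRect(face.getLeft(), face.getTop(), face.getRight(), face.getBottom(), rectPaint);

    // One small dot per landmark.
    Paint pointPaint = new Paint();
    pointPaint.setColor(Color.RED);
    pointPaint.setStyle(Paint.Style.FILL);
    for (Point p : face.getFaceLandmarks()) {
        canvas.drawCircle(p.x, p.y, 2f, pointPaint);
    }
    return out;
}
```

For example, call `drawFace(bmp, face)` inside `onPreviewFrame` after `detectBitmap(bmp)` and show the result in an `ImageView` to verify the detection visually.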