# TestMeidiaPipeIOS
**Repository Path**: yangyuanwei/test-meidia-pipe-ios
## Basic Information
- **Project Name**: TestMeidiaPipeIOS
- **Description**: No description available
- **Primary Language**: Objective-C
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 2
- **Created**: 2022-03-29
- **Last Updated**: 2022-03-29
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
# 1. Configure the Environment and Download the Code
- Install Homebrew and Xcode, then install the Xcode Command Line Tools: `xcode-select --install`.
- Install Bazel via Homebrew; MediaPipe requires Bazel 3.4 or later. I initially had 3.1.0 installed and, after an OS upgrade plus network problems, could not update the Homebrew repositories; after switching to a mirror in China (see below) I was able to upgrade to 4.0.0.
```
# Install bazel
brew install bazel
# Check the bazel version
bazel version
# Upgrade bazel
brew update
brew upgrade bazel
```
If `brew update` or the Bazel upgrade fails with "homebrew-core is a shallow clone" / "homebrew-cask is a shallow clone", unshallow the taps:
```
git -C /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core fetch --unshallow
git -C /usr/local/Homebrew/Library/Taps/homebrew/homebrew-cask fetch --unshallow
brew update
```
If `git -C /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core fetch --unshallow` itself fails because the HTTP transport does not support the shallow capability, switching the Homebrew remotes to a mirror fixes it, and is faster too:
```
# Replace brew.git:
$ cd "$(brew --repo)"
# USTC (University of Science and Technology of China) mirror:
$ git remote set-url origin https://mirrors.ustc.edu.cn/brew.git
# Tsinghua University mirror:
$ git remote set-url origin https://mirrors.tuna.tsinghua.edu.cn/git/homebrew/brew.git
# Replace homebrew-core.git:
$ cd "$(brew --repo)/Library/Taps/homebrew/homebrew-core"
# USTC mirror:
$ git remote set-url origin https://mirrors.ustc.edu.cn/homebrew-core.git
# Tsinghua mirror:
$ git remote set-url origin https://mirrors.tuna.tsinghua.edu.cn/git/homebrew/homebrew-core.git
# Replace homebrew-cask.git:
$ cd "$(brew --repo)/Library/Taps/homebrew/homebrew-cask"
# USTC mirror:
$ git remote set-url origin https://mirrors.ustc.edu.cn/homebrew-cask.git
# Tsinghua mirror:
$ git remote set-url origin https://mirrors.tuna.tsinghua.edu.cn/git/homebrew/homebrew-cask.git
# Replace homebrew-bottles:
# USTC mirror:
$ echo 'export HOMEBREW_BOTTLE_DOMAIN=https://mirrors.ustc.edu.cn/homebrew-bottles' >> ~/.bash_profile
$ source ~/.bash_profile
# Tsinghua mirror:
$ echo 'export HOMEBREW_BOTTLE_DOMAIN=https://mirrors.tuna.tsinghua.edu.cn/homebrew-bottles' >> ~/.bash_profile
$ source ~/.bash_profile
# Apply the changes:
$ brew update
```
- Download the MediaPipe source code:
```
git clone https://github.com/google/mediapipe.git
```
If the clone fails with `SSL_ERROR_SYSCALL in connection to github.com:443`: in my case a custom DNS setting was the culprit, and clearing it fixed the problem. Alternatively, clone from the Gitee mirror:
```
git clone https://gitee.com/mirrors/mediapipe.git
```
- Install OpenCV and FFmpeg. Install OpenCV through Homebrew; it appears to pull in FFmpeg as a dependency.
```
$ brew install opencv@3
# There is a known issue caused by the glog dependency. Uninstall glog.
$ brew uninstall --ignore-dependencies glog
```
- Make sure Python 3 and its "six" library are installed.
```
$ brew install python3
$ which python3
/usr/local/bin/python3
$ vim ~/.bash_profile
# add the line: alias python="/usr/local/bin/python3"
$ source ~/.bash_profile
$ python --version
Python 3.9.2
$ pip3 install --user six
```
- Run the Hello World example to verify that the environment is set up correctly.
```
$ export GLOG_logtostderr=1
# Need bazel flag 'MEDIAPIPE_DISABLE_GPU=1' as desktop GPU is currently not supported
$ bazel run --define MEDIAPIPE_DISABLE_GPU=1 \
mediapipe/examples/desktop/hello_world:hello_world
# Should print:
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
# Hello World!
```
During the build, Bazel fetches and verifies resources from servers outside China, so a proxy may be needed. The port depends on how your proxy software is configured.
```
# Set the proxy for the current shell session
export http_proxy=http://127.0.0.1:1087;export https_proxy=http://127.0.0.1:1087;
# Use curl to test HTTP connectivity from the command line
curl www.google.com
# If your proxy software uses PAC rules, add the following domains to its rule list
github.com
githubusercontent.com
githubapp.com
mirror.bazel.build
jcenter.bintray.com
maven.google.com
dl.google.com/dl/android/maven2
repo1.maven.org/maven2
```
# 2. Build and Run the Official MediaPipe iOS Demos
- Reference: https://google.github.io/mediapipe/getting_started/ios.html
- Set the bundle ID prefix. MediaPipe provides a script that sets it for you, but it must be paired with the Automatic provisioning steps later in this section. [Custom provisioning](https://google.github.io/mediapipe/getting_started/ios.html#custom-provisioning) does not use this script and requires a different setup instead.
```
python3 mediapipe/examples/ios/link_local_profiles.py
```
- Install Tulsi, a tool that generates an Xcode project from Bazel build configuration files:
```
# cd out of the mediapipe directory, then:
git clone https://github.com/bazelbuild/tulsi.git
cd tulsi
# remove Xcode version from Tulsi's .bazelrc (see http://github.com/bazelbuild/tulsi#building-and-installing):
sed -i .orig '/xcode_version/d' .bazelrc
# build and run Tulsi:
sh build_and_run.sh
```
- Open Tulsi, click "Open existing project", and select mediapipe/Mediapipe.tulsiproj.
- In the window that opens, go to the "Configs" tab, select "MediaPipe", click "Generate", choose where to save the generated project, and let generation run.
- Generation failed partway with `Error in run_shell: 'command' must be of type string. passing a sequence of strings as 'command' is deprecated. To temporarily disable this check, set --incompatible_run_shell_command_string=false.` My Bazel version was 4.0.0 and had to be replaced with 3.7.x:
```
# Uninstall the previous bazel
brew uninstall bazel
# Download the bazel 3.7.2 installer
export BAZEL_VERSION=3.7.2
curl -fLO "https://github.com/bazelbuild/bazel/releases/download/${BAZEL_VERSION}/bazel-${BAZEL_VERSION}-installer-darwin-x86_64.sh"
# Install bazel
chmod +x "bazel-${BAZEL_VERSION}-installer-darwin-x86_64.sh"
./bazel-${BAZEL_VERSION}-installer-darwin-x86_64.sh --user
# Put it on PATH; the default install directory is $HOME/bin
sudo vim ~/.bash_profile
export PATH="$PATH:$HOME/bin"
source ~/.bash_profile
# For some reason Tulsi feels much slower after this change; "Generate" takes a while before it becomes clickable
```
- Another error along the way: `Error in fail: Invalid character(s) in bundle_id: "*SEE_IOS_INSTRUCTIONS*.mediapipe.examples.FaceDetectionCpu"`. Run the bundle ID script:
```
python3 mediapipe/examples/ios/link_local_profiles.py
```
- Downloads also failed partway through, presumably because some sites are unreachable without a proxy. Set the proxy in the shell and launch Tulsi from the command line:
```
export http_proxy=http://127.0.0.1:1087;export https_proxy=http://127.0.0.1:1087;
open ~/Applications/Tulsi.app/
```
- Generate a provisioning profile. Installing an iOS app requires one, and there are two kinds: Automatic and Custom. Automatic profiles are created by Xcode and expire about a week after the app is installed on a device, after which the app must be reinstalled. Custom profiles are created through a developer account and remain valid much longer, but require a paid Apple developer account. Since we are only testing, use the automatic method:
  1. Open the MediaPipe demo project generated above and click the "Mediapipe" project in the left sidebar.
  2. Pick one of the targets, e.g. HandTrackingGpuApp (every target needs this setup, so just go through them in order).
  3. Select the "Signing & Capabilities" tab.
  4. Check "Automatically manage signing" and confirm the popup.
  5. Set the Team dropdown to your personal account.
  6. Repeat steps 2-5 for all the targets.
  7. From the mediapipe source directory, run the script below to link the automatically created provisioning profiles.
```
#once a profile expires, Xcode will generate a new one; you must then run this script again to link the updated profiles.
python3 mediapipe/examples/ios/link_local_profiles.py
```
- Pick a target, build, and run. This failed with `//mediapipe/examples/ios/facedetectioncpu:FaceDetectionCpuApp_entitlements: missing input file '//mediapipe:provisioning_profile.mobileprovision'`. Something went wrong generating the Automatic provisioning profile in the previous step, so switch to Custom provisioning:
1. Get a provisioning profile from Apple. Open [Apple's developer site](https://developer.apple.com/account/resources/), select Profiles and click the plus sign next to it, choose iOS App Development and click Continue. For the App ID, pick XC Wildcard from the list and note the alphanumeric prefix in parentheses, e.g. (3T8Q4GSCBU.*), then click Continue. Check the development machine you will build on and click Continue, select the phone(s) you will install on and click Continue, enter a name such as MyProvisioningProfile, click Generate, and download the file.
2. Symlink or copy the provisioning profile to mediapipe/mediapipe/provisioning_profile.mobileprovision:
```
cd mediapipe
ln -s ~/Downloads/MyProvisioningProfile.mobileprovision mediapipe/provisioning_profile.mobileprovision
```
3. If you set up automatic provisioning earlier, remove the old provisioning_profile.mobileprovision symlinks from each app directory under mediapipe/examples/ios. In my case no such files existed; they were presumably never generated, which is why the build failed after the automatic provisioning setup.
4. Open mediapipe/examples/ios/bundle_id.bzl and change BUNDLE_ID_PREFIX so the bundle IDs connect to the provisioning profile. Its value is the alphanumeric prefix recorded when the profile was generated, e.g. 3T8Q4GSCBU. A sketch of the edit follows.
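A sketch of the bundle_id.bzl edit, using the example prefix from step 1 (substitute the prefix from your own profile):
```
# mediapipe/examples/ios/bundle_id.bzl
# Replace the placeholder prefix with the one noted when generating the
# profile; 3T8Q4GSCBU is the example value from above, not a real team ID.
BUNDLE_ID_PREFIX = "3T8Q4GSCBU"
```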
5. Regenerate the Mediapipe project with Tulsi. Use Product -> Clean in the menu bar to clear the old build cache, then select the desired target again and run it.
# 3. Create a New App That Uses MediaPipe
- Create a new project: File > New > Single View App. Set the product name to "HelloWorld" and an organization identifier such as `com.google.mediapipe`, so the `bundle_id` becomes `com.google.mediapipe.HelloWorld`. Set the language to Objective-C. Save the project to a convenient path; call it `$PROJECT_TEMPLATE_LOC`, so the project is created at `$PROJECT_TEMPLATE_LOC/HelloWorld`, containing a folder named `HelloWorld` and a `HelloWorld.xcodeproj` file.
- HelloWorld.xcodeproj will not be used in this tutorial, because we build the iOS app with Bazel. The contents of $PROJECT_TEMPLATE_LOC/HelloWorld/HelloWorld are: 1. AppDelegate.h and AppDelegate.m; 2. ViewController.h and ViewController.m; 3. main.m; 4. Info.plist; 5. the Base.lproj folder containing Main.storyboard and LaunchScreen.storyboard; 6. the Assets.xcassets folder; 7. SceneDelegate.h and SceneDelegate.m (added by newer versions of Xcode). Copy these files into a directory named HelloWorld that can access the MediaPipe source; for example, copy them into mediapipe/examples/ios/HelloWorld, and call this path $APPLICATION_PATH. In effect you copy $PROJECT_TEMPLATE_LOC/HelloWorld/HelloWorld to mediapipe/examples/ios/HelloWorld. Note that mediapipe/examples/ios already contains a helloworld directory, so delete it first (macOS folder names are case-insensitive); if your project is not named HelloWorld this collision does not apply. A sketch of the copy follows.
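A minimal sketch of the copy step in the shell, assuming $PROJECT_TEMPLATE_LOC is set as described above:
```
# macOS treats "helloworld" and "HelloWorld" as the same name,
# so remove the existing example directory first.
rm -rf mediapipe/examples/ios/helloworld
cp -R "$PROJECT_TEMPLATE_LOC/HelloWorld/HelloWorld" mediapipe/examples/ios/HelloWorld
```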
- Create a BUILD file in $APPLICATION_PATH and add the build rules below. The objc_library rule lists the AppDelegate, ViewController, and SceneDelegate classes, main.m, and the app storyboards (Xcode 11 introduced SceneDelegate, so it goes into the build rule as well). The template app depends only on the UIKit SDK, so UIKit is the only framework added. The ios_application rule consumes the resulting HelloWorldAppLibrary Objective-C library to build an iOS app installable on a device. provisioning_profile is the path to the provisioning profile; I used the //mediapipe:provisioning_profile.mobileprovision created earlier. I am not sure whether a fresh provisioning profile needs to be created to match the bundle_id.
```
MIN_IOS_VERSION = "10.0"
load(
"@build_bazel_rules_apple//apple:ios.bzl",
"ios_application",
)
ios_application(
name = "HelloWorldApp",
bundle_id = "com.google.mediapipe.HelloWorld",
families = [
"iphone",
"ipad",
],
infoplists = ["Info.plist"],
minimum_os_version = MIN_IOS_VERSION,
provisioning_profile = "//mediapipe:provisioning_profile.mobileprovision",
deps = [":HelloWorldAppLibrary"],
)
objc_library(
name = "HelloWorldAppLibrary",
srcs = [
"AppDelegate.m",
"SceneDelegate.m",
"ViewController.m",
"main.m",
],
hdrs = [
"AppDelegate.h",
"SceneDelegate.h",
"ViewController.h",
],
data = [
"Base.lproj/LaunchScreen.storyboard",
"Base.lproj/Main.storyboard",
],
sdk_frameworks = [
"UIKit",
],
deps = [],
)
```
- Build the app with Bazel from the command line: `bazel build -c opt --config=ios_arm64 <$APPLICATION_PATH>:HelloWorldApp`, replacing <$APPLICATION_PATH> with the path recorded earlier; HelloWorldApp is the name of the ios_application in the BUILD file. A successful build produces an .ipa; copy it out for later use.
```
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/helloworld:HelloWorldApp
# Copy the .ipa out for later use
cp bazel-bin/mediapipe/examples/ios/helloworld/HelloWorldApp.ipa mediapipe/examples/ios/helloworld/
```
The build failed with `ERROR: In target "//mediapipe/examples/ios/helloworld:HelloWorldApp"; unknown variable reference "$(PRODUCT_BUNDLE_PACKAGE_TYPE)" while merging plists (key: "CFBundlePackageType", value: "$(PRODUCT_BUNDLE_PACKAGE_TYPE)")`. From what I found, this entry was added by Xcode 11 or 12; open Info.plist and delete it:
```
<!-- Delete these two lines from Info.plist -->
<key>CFBundlePackageType</key>
<string>$(PRODUCT_BUNDLE_PACKAGE_TYPE)</string>
```
- Open Xcode and press Cmd-Shift-2 to open Devices and Simulators. With the phone connected, you can see the list of installed apps; click the plus sign at the bottom and select the .ipa built above to install it. Once installed, an app named HelloWorldApp appears on the phone; opening it shows a blank screen.
- Now add code so the app does something with MediaPipe; the tutorial demonstrates edge detection. Since storyboard changes are involved, it is easiest to edit the code and views in the original Xcode project first and then copy them into the build directory. Start by adding the camera permission to Info.plist:
```
<key>NSCameraUsageDescription</key>
<string>This app needs your permission to access the camera</string>
<!-- In the Xcode plist editor this appears as:
     Privacy - Camera Usage Description (String):
     This app needs your permission to access the camera -->
```
- MediaPipe provides the MPPCameraInputSource class (built mainly on AVCaptureSession) to access the camera and fetch video frames. Open ViewController.m and add the following code.
```
#import "ViewController.h"
#import "mediapipe/objc/MPPCameraInputSource.h"
#import "mediapipe/objc/MPPLayerRenderer.h"
static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";
@interface ViewController () <MPPInputSourceDelegate>
@end
@implementation ViewController{
// Handles camera access via AVCaptureSession library.
MPPCameraInputSource* _cameraSource;
// Process camera frames on this queue.
dispatch_queue_t _videoQueue;
// Display the camera preview frames.
__weak IBOutlet UIView *_liveView;
// Render frames in a layer.
MPPLayerRenderer* _renderer;
}
/**
 This code initializes _cameraSource, setting the capture session preset and which camera to use. We need to pull frames from _cameraSource and display them in the ViewController.
 MPPCameraInputSource is a subclass of MPPInputSource and defines a protocol for its delegate, MPPInputSourceDelegate. ViewController implements MPPInputSourceDelegate and is set as _cameraSource's delegate.
 We also need a queue on which to process the incoming camera frames, named _videoQueue: a serial queue with QOS_CLASS_USER_INTERACTIVE priority.
 */
- (void)viewDidLoad {
[super viewDidLoad];
_cameraSource = [[MPPCameraInputSource alloc] init];
_cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
_cameraSource.cameraPosition = AVCaptureDevicePositionBack;
// The frame's native format is rotated with respect to the portrait orientation.
_cameraSource.orientation = AVCaptureVideoOrientationPortrait;
[_cameraSource setDelegate:self queue:_videoQueue];
dispatch_queue_attr_t qosAttribute = dispatch_queue_attr_make_with_qos_class(
DISPATCH_QUEUE_SERIAL, QOS_CLASS_USER_INTERACTIVE, /*relative_priority=*/0);
_videoQueue = dispatch_queue_create(kVideoQueueLabel, qosAttribute);
_renderer = [[MPPLayerRenderer alloc] init];
_renderer.layer.frame = _liveView.layer.bounds;
[_liveView.layer addSublayer:_renderer.layer];
_renderer.frameScaleMode = MPPFrameScaleModeFillAndCrop;
}
/**
 MediaPipe provides another class, MPPLayerRenderer, for displaying images on screen. It can display CVPixelBufferRef objects, which is the image type MPPCameraInputSource delivers to its delegate.
 To show images on screen, add a new UIView member named _liveView to ViewController.
 In Main.storyboard, drag a UIView from the object library onto the ViewController's view, connect a referencing outlet from that view to the _liveView object just added to the ViewController class, and resize the view so it is centered and covers the whole screen.
 Back in ViewController.m, viewDidLoad initializes the _renderer object and adds _renderer's layer to _liveView for display.
 The class implements the MPPInputSourceDelegate protocol: viewDidLoad sets it as _cameraSource's delegate, and processVideoFrame receives frame data from _cameraSource and displays it via _renderer's renderPixelBuffer method.
 */
// Must be invoked on _videoQueue.
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
timestamp:(CMTime)timestamp
fromSource:(MPPInputSource*)source {
if (source != _cameraSource) {
NSLog(@"Unknown source: %@", source);
return;
}
// Display the captured image on the screen.
CFRetain(imageBuffer);
dispatch_async(dispatch_get_main_queue(), ^{
[_renderer renderPixelBuffer:imageBuffer];
CFRelease(imageBuffer);
});
}
/**
 Start the camera as soon as the view appears. Because the camera requires user permission, MPPCameraInputSource provides requestCameraAccessWithCompletionHandler to handle the result of the permission request, so the camera is started only after the user responds.
 */
-(void)viewWillAppear:(BOOL)animated {
[super viewWillAppear:animated];
[_cameraSource requestCameraAccessWithCompletionHandler:^void(BOOL granted) {
if (granted) {
dispatch_async(_videoQueue, ^{
[_cameraSource start];
});
}
}];
}
@end
```
- Update the BUILD file again: since the code now uses MediaPipe classes, the MediaPipe libraries must be imported. Add the following:
```
sdk_frameworks = [
"AVFoundation",
"CoreGraphics",
"CoreMedia",
],
deps = [
"//mediapipe/objc:mediapipe_framework_ios",
"//mediapipe/objc:mediapipe_input_sources_ios",
"//mediapipe/objc:mediapipe_layer_renderer",
],
```
- Rebuild and install the .ipa, but the screen is still white with no camera image. It took a long time to discover that when [_cameraSource setDelegate:self queue:_videoQueue] is called, _videoQueue has not been initialized yet, so the _videoQueue initialization must be moved before the _cameraSource setup, as in the sketch below. When the app works, the phone shows the back-camera feed; the code added so far only grabs camera frames and renders them on screen.
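A sketch of the corrected viewDidLoad ordering (the same code as above, with the queue created before it is handed to the camera source):
```
- (void)viewDidLoad {
  [super viewDidLoad];
  // Create the video queue first, so it is non-nil when passed to setDelegate:queue:.
  dispatch_queue_attr_t qosAttribute = dispatch_queue_attr_make_with_qos_class(
      DISPATCH_QUEUE_SERIAL, QOS_CLASS_USER_INTERACTIVE, /*relative_priority=*/0);
  _videoQueue = dispatch_queue_create(kVideoQueueLabel, qosAttribute);

  _cameraSource = [[MPPCameraInputSource alloc] init];
  _cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
  _cameraSource.cameraPosition = AVCaptureDevicePositionBack;
  // The frame's native format is rotated with respect to the portrait orientation.
  _cameraSource.orientation = AVCaptureVideoOrientationPortrait;
  [_cameraSource setDelegate:self queue:_videoQueue];

  _renderer = [[MPPLayerRenderer alloc] init];
  _renderer.layer.frame = _liveView.layer.bounds;
  [_liveView.layer addSublayer:_renderer.layer];
  _renderer.frameScaleMode = MPPFrameScaleModeFillAndCrop;
}
```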
- With camera frames displaying, the next step is to process them with MediaPipe. MediaPipe's image processing requires a graph and its associated resource files. First open the BUILD file and add the graph and calculators, and change ViewController.m to ViewController.mm:
```
data = [
"//mediapipe/graphs/edge_detection:mobile_gpu_binary_graph",
],
deps = [
"//mediapipe/graphs/edge_detection:mobile_calculators",
],
srcs = [
"ViewController.mm",
],
```
- Add the MPPGraph header:
```
#import "mediapipe/objc/MPPGraph.h"
```
- Rename ViewController.m to ViewController.mm and open it for editing. Declare static constants for the graph name and its input and output stream names:
```
static NSString* const kGraphName = @"mobile_gpu";
static const char* kInputStream = "input_video";
static const char* kOutputStream = "output_video";
```
- Add a mediapipeGraph property to the ViewController's interface:
```
// The MediaPipe graph currently in use. Initialized in viewDidLoad, started in viewWillAppear: and
// sent video frames on _videoQueue.
@property(nonatomic) MPPGraph* mediapipeGraph;
```
- We first initialize the graph in viewDidLoad, loading the graph compiled from the .pbtxt file with the following method:
```
+ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
// Load the graph config resource.
NSError* configLoadError = nil;
NSBundle* bundle = [NSBundle bundleForClass:[self class]];
if (!resource || resource.length == 0) {
return nil;
}
NSURL* graphURL = [bundle URLForResource:resource withExtension:@"binarypb"];
NSData* data = [NSData dataWithContentsOfURL:graphURL options:0 error:&configLoadError];
if (!data) {
NSLog(@"Failed to load MediaPipe graph config: %@", configLoadError);
return nil;
}
// Parse the graph config resource into mediapipe::CalculatorGraphConfig proto object.
mediapipe::CalculatorGraphConfig config;
config.ParseFromArray(data.bytes, data.length);
// Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
[newGraph addFrameOutputStream:kOutputStream outputPacketType:MPPPacketTypePixelBuffer];
return newGraph;
}
```
- Call this function in viewDidLoad to initialize the graph:
```
self.mediapipeGraph = [[self class] loadGraphFromResource:kGraphName];
```
- To let the graph send the results of processing camera frames back to the ViewController, set ViewController as the mediapipeGraph object's delegate after initializing the graph (note that ViewController must also conform to MPPGraphDelegate; see the sketch after this snippet):
```
self.mediapipeGraph.delegate = self;
```
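The class extension declared earlier must now conform to both delegate protocols; the updated declaration:
```
@interface ViewController () <MPPGraphDelegate, MPPInputSourceDelegate>
@end
```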
- To avoid memory contention while processing frames from the live video feed, add a limit:
```
// Set maxFramesInFlight to a small value to avoid memory contention for real-time processing.
self.mediapipeGraph.maxFramesInFlight = 2;
```
- Start the graph when the user grants our app permission to use the camera. Note: it is important to start the graph and wait for it to finish its initial run before starting the camera, so the graph can process frames as soon as the camera starts sending them:
```
[_cameraSource requestCameraAccessWithCompletionHandler:^void(BOOL granted) {
if (granted) {
// Start running self.mediapipeGraph.
NSError* error;
if (![self.mediapipeGraph startWithError:&error]) {
NSLog(@"Failed to start graph: %@", error);
}
else if (![self.mediapipeGraph waitUntilIdleWithError:&error]) {
NSLog(@"Failed to complete graph initial run: %@", error);
}
dispatch_async(_videoQueue, ^{
[_cameraSource start];
});
}
}];
```
- Previously, when processVideoFrame received a frame from the camera, we displayed it in _liveView with the renderer. Now the frames must be sent into the graph, and the graph's results rendered instead. We send the image buffer into self.mediapipeGraph's input stream kInputStream ("input_video") as a packet of type MPPPacketTypePixelBuffer:
```
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
timestamp:(CMTime)timestamp
fromSource:(MPPInputSource*)source {
if (source != _cameraSource) {
NSLog(@"Unknown source: %@", source);
return;
}
[self.mediapipeGraph sendPixelBuffer:imageBuffer
intoStream:kInputStream
packetType:MPPPacketTypePixelBuffer];
}
```
- The graph receives the input packets, runs, and emits its results on kOutputStream ("output_video"). Implement the following delegate method to receive packets on that output stream and display them on screen:
```
- (void)mediapipeGraph:(MPPGraph*)graph
didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
fromStream:(const std::string&)streamName {
if (streamName == kOutputStream) {
// Display the captured image on the screen.
CVPixelBufferRetain(pixelBuffer);
dispatch_async(dispatch_get_main_queue(), ^{
[_renderer renderPixelBuffer:pixelBuffer];
CVPixelBufferRelease(pixelBuffer);
});
}
}
```
- Rebuild, install, and test. Opening the app now shows the back-camera feed processed with edge detection.
# 4. Using MediaPipe from Another Project
- The previous sections covered building an iOS project with Bazel and using MediaPipe, but how do you integrate MediaPipe into a large existing project? A few ideas:
  1. Convert the existing project to build with Bazel. Following the earlier steps then makes the MediaPipe library easy to use, but Xcode debugging is lost.
  2. Still manage the build with Bazel, but use Tulsi to generate an Xcode project for debugging and installing.
  3. Use the Tulsi-generated Xcode project, and see whether another Xcode project can import the MediaPipe-enabled project in Xcode.
  4. Use Bazel to produce a library: package the MediaPipe-using classes and resources into a library exposing an API, which other projects can then add directly in Xcode.
- Option 4 looks the simplest, so build a library from the earlier HelloWorld project directly with the build command:
```
bazel build mediapipe/examples/ios/helloworld:HelloWorldAppLibrary
```
This failed with `error executing command external/local_config_cc/wrapped_clang -arch x86_64 '-stdlib=libc++' '-std=gnu++11' '-D_FORTIFY_SOURCE=1' -fstack-protector -fcolor-diagnostics -Wall -Wthread-safety -Wself-assign ... (remaining 79 argument(s) skipped)` and `error: unknown type name 'EAGLContext'`. The architecture config needs to be specified:
```
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/helloworld:HelloWorldAppLibrary
```
- If the library builds successfully, its path is printed at the end of the output. Copy it out for later use:
```
cp bazel-bin/mediapipe/examples/ios/helloworld/libHelloWorldAppLibrary.a mediapipe/examples/ios/helloworld/
```
- Create a new Xcode project the same way as the earlier HelloWorld project, named e.g. TestHelloWorldAppLibrary. Since the compiled library already contains AppDelegate and the other classes, delete the duplicate files: AppDelegate.h and AppDelegate.m, SceneDelegate.h and SceneDelegate.m, ViewController.h and ViewController.m, Main.storyboard and LaunchScreen.storyboard, and main.m.
- Copy AppDelegate.h, libHelloWorldAppLibrary.a, SceneDelegate.h, and ViewController.h from mediapipe/examples/ios/helloworld into the TestHelloWorldAppLibrary project directory. Open the project in Xcode, right-click the TestHelloWorldAppLibrary group, choose Add Files to "TestHelloWorldAppLibrary", and add the files just copied.
- Add the camera permission to Info.plist:
```
<key>NSCameraUsageDescription</key>
<string>This app needs your permission to access the camera</string>
<!-- In the Xcode plist editor this appears as:
     Privacy - Camera Usage Description (String):
     This app needs your permission to access the camera -->
```
- Click the "TestHelloWorldAppLibrary" project in the sidebar and select the target "TestHelloWorldAppLibrary". Open "Signing & Capabilities", check "Automatically manage signing" (confirm the popup), and set Team to your personal account. In "General", set the Deployment Info to iOS 10.0.
- Running failed with `The file "TestHelloWorldAppLibrary" couldn't be opened because you don't have permission to view it`. The fixes found online did not help; on reflection, the cause should be that main.m is missing. As with the other files, copy main.m from the HelloWorld project into the TestHelloWorldAppLibrary project and add it to the project (the template main.m is shown below for reference).
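For reference, the standard Xcode 11 template main.m being copied looks like this:
```
#import <UIKit/UIKit.h>
#import "AppDelegate.h"

int main(int argc, char * argv[]) {
    NSString * appDelegateClassName;
    @autoreleasepool {
        // Setup code that might create autoreleased objects goes here.
        appDelegateClassName = NSStringFromClass([AppDelegate class]);
    }
    return UIApplicationMain(argc, argv, nil, appDelegateClassName);
}
```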
- Next error: `ld: '/Users/leesonzhong/Desktop/build_mediapipe/TestHelloWorldAppLibrary/TestHelloWorldAppLibrary/libHelloWorldAppLibrary.a(AppDelegate.o)' does not contain bitcode. You must rebuild it with bitcode enabled (Xcode setting ENABLE_BITCODE), obtain an updated library from the vendor, or disable bitcode for this target. for architecture arm64`. Select the target "TestHelloWorldAppLibrary", open "Build Settings", select the "All" filter, search for "ENABLE_BITCODE", and set Enable Bitcode to "No".
- The build and install now succeed but the app hangs at launch with `Could not find a storyboard named 'Main' in bundle NSBundle`. As with the other files, copy Main.storyboard and LaunchScreen.storyboard from the HelloWorld project into the TestHelloWorldAppLibrary project and add them to the project.
- It still fails, with `UIWindowSceneSessionRoleApplication contained UISceneDelegateClassName key, but could not load class with name "SceneDelegate"` and `Unknown class ViewController in Interface Builder file`. Evidently the original HelloWorld code and configuration cannot be reused as-is.
- So modify the original HelloWorld code and configuration: wrap the MediaPipe operations behind an API for callers, and keep any on-screen display logic out of the library. For editing, copy HelloWorld.xcodeproj into mediapipe/examples/ios/ and open it; the files to edit are the ones under mediapipe/examples/ios/HelloWorld. To make MediaPipe's classes and methods easier to use inside the Xcode project, create a mediapipe folder under mediapipe/examples/ios/HelloWorld, copy the mediapipe/objc directory into mediapipe/examples/ios/HelloWorld/mediapipe, and add the mediapipe/objc sources to Xcode.
- In the HelloWorld project: File > New > Cocoa Touch Class, with Class: MediapipeEdgeDetection, Subclass of: NSObject, Language: Objective-C. This generates MediapipeEdgeDetection.h and MediapipeEdgeDetection.m; rename MediapipeEdgeDetection.m to MediapipeEdgeDetection.mm.
- MediapipeEdgeDetection.h (the import targets were lost in formatting; presumably Foundation and UIKit, since the interface uses UIView):
```
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
NS_ASSUME_NONNULL_BEGIN
@interface MediapipeEdgeDetection : NSObject
-(void)initCamera;
-(void)initRender:(UIView *)liveView;
-(void)initGraph;
-(void)startCameraAndGraph;
@end
NS_ASSUME_NONNULL_END
```
- MediapipeEdgeDetection.mm:
```
#import "MediapipeEdgeDetection.h"
#import "mediapipe/objc/MPPCameraInputSource.h"
#import "mediapipe/objc/MPPLayerRenderer.h"
#import "mediapipe/objc/MPPGraph.h"
static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";
static NSString* const kGraphName = @"mobile_gpu";
static const char* kInputStream = "input_video";
static const char* kOutputStream = "output_video";
@interface MediapipeEdgeDetection() <MPPGraphDelegate, MPPInputSourceDelegate>
@property(nonatomic) MPPGraph* mediapipeGraph;
@end
@implementation MediapipeEdgeDetection{
// Handles camera access via AVCaptureSession library.
MPPCameraInputSource* _cameraSource;
// Process camera frames on this queue.
dispatch_queue_t _videoQueue;
// Render frames in a layer.
MPPLayerRenderer* _renderer;
}
-(void)initCamera{
dispatch_queue_attr_t qosAttribute = dispatch_queue_attr_make_with_qos_class(
DISPATCH_QUEUE_SERIAL, QOS_CLASS_USER_INTERACTIVE, /*relative_priority=*/0);
_videoQueue = dispatch_queue_create(kVideoQueueLabel, qosAttribute);
_cameraSource = [[MPPCameraInputSource alloc] init];
_cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
_cameraSource.cameraPosition = AVCaptureDevicePositionBack;
// The frame's native format is rotated with respect to the portrait orientation.
_cameraSource.orientation = AVCaptureVideoOrientationPortrait;
[_cameraSource setDelegate:self queue:_videoQueue];
}
-(void)initRender:(UIView *)liveView{
_renderer = [[MPPLayerRenderer alloc] init];
_renderer.layer.frame = liveView.layer.bounds;
[liveView.layer addSublayer:_renderer.layer];
_renderer.frameScaleMode = MPPFrameScaleModeFillAndCrop;
}
-(void)initGraph{
self.mediapipeGraph = [[self class] loadGraphFromResource:kGraphName];
self.mediapipeGraph.delegate = self;
// Set maxFramesInFlight to a small value to avoid memory contention for real-time processing.
self.mediapipeGraph.maxFramesInFlight = 2;
}
-(void)startCameraAndGraph{
[_cameraSource requestCameraAccessWithCompletionHandler:^void(BOOL granted) {
if (granted) {
// Start running self.mediapipeGraph.
NSError* error;
if (![self.mediapipeGraph startWithError:&error]) {
NSLog(@"Failed to start graph: %@", error);
}
else if (![self.mediapipeGraph waitUntilIdleWithError:&error]) {
NSLog(@"Failed to complete graph initial run: %@", error);
}
dispatch_async(self->_videoQueue, ^{
[self->_cameraSource start];
});
}
}];
}
//--------------------MPPInputSourceDelegate--------------------
// Must be invoked on _videoQueue.
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
timestamp:(CMTime)timestamp
fromSource:(MPPInputSource*)source {
if (source != _cameraSource) {
NSLog(@"Unknown source: %@", source);
return;
}
if (self.mediapipeGraph) {
[self.mediapipeGraph sendPixelBuffer:imageBuffer
intoStream:kInputStream
packetType:MPPPacketTypePixelBuffer];
}
}
//--------------------MPPGraphDelegate--------------------
- (void)mediapipeGraph:(MPPGraph*)graph
didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
fromStream:(const std::string&)streamName {
if (streamName == kOutputStream) {
// Display the captured image on the screen.
CVPixelBufferRetain(pixelBuffer);
if (self->_renderer) {
dispatch_async(dispatch_get_main_queue(), ^{
[self->_renderer renderPixelBuffer:pixelBuffer];
CVPixelBufferRelease(pixelBuffer);
});
}
}
}
+ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
// Load the graph config resource.
NSError* configLoadError = nil;
NSBundle* bundle = [NSBundle bundleForClass:[self class]];
if (!resource || resource.length == 0) {
return nil;
}
NSURL* graphURL = [bundle URLForResource:resource withExtension:@"binarypb"];
NSData* data = [NSData dataWithContentsOfURL:graphURL options:0 error:&configLoadError];
if (!data) {
NSLog(@"Failed to load MediaPipe graph config: %@", configLoadError);
return nil;
}
// Parse the graph config resource into mediapipe::CalculatorGraphConfig proto object.
mediapipe::CalculatorGraphConfig config;
config.ParseFromArray(data.bytes, data.length);
// Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
[newGraph addFrameOutputStream:kOutputStream outputPacketType:MPPPacketTypePixelBuffer];
return newGraph;
}
@end
```
- The BUILD file:
```
MIN_IOS_VERSION = "10.0"
load(
"@build_bazel_rules_apple//apple:ios.bzl",
"ios_application",
)
objc_library(
name = "HelloWorldAppLibrary",
srcs = [
"MediapipeEdgeDetection.mm",
],
hdrs = [
"MediapipeEdgeDetection.h",
],
data = [
"//mediapipe/graphs/edge_detection:mobile_gpu_binary_graph",
],
sdk_frameworks = [
"UIKit",
"AVFoundation",
"CoreGraphics",
"CoreMedia",
],
deps = [
"//mediapipe/objc:mediapipe_framework_ios",
"//mediapipe/objc:mediapipe_input_sources_ios",
"//mediapipe/objc:mediapipe_layer_renderer",
"//mediapipe/graphs/edge_detection:mobile_calculators",
],
)
```
- Rebuild the library with Bazel. To also build armv7, add --config=ios_armv7, but that fails with `iOS 10 is the maximum deployment target for 32-bit targets`; since ios_armv7 only matters on very old iOS versions, it is generally unnecessary.
```
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/helloworld:HelloWorldAppLibrary
# After a successful build, copy the library out for later use
sudo cp bazel-bin/mediapipe/examples/ios/helloworld/libHelloWorldAppLibrary.a mediapipe/examples/ios/helloworld
```
- Delete the old TestHelloWorldAppLibrary project and create a fresh one. Copy libHelloWorldAppLibrary.a and MediapipeEdgeDetection.h into the project directory and add them to the project.
- Open Main.storyboard, drag a UIView from the object library onto the ViewController's view, connect a referencing outlet from that view to a liveView object in the ViewController class, and resize the view so it is centered and covers the whole screen.
- Open ViewController.m and call the MediapipeEdgeDetection methods:
```
#import "ViewController.h"
#import "MediapipeEdgeDetection.h"
@interface ViewController (){
MediapipeEdgeDetection *detection;
}
@property (weak, nonatomic) IBOutlet UIView *liveView;
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
detection = [[MediapipeEdgeDetection alloc] init];
[detection initCamera];
[detection initRender:_liveView];
[detection initGraph];
}
-(void)viewWillAppear:(BOOL)animated{
[super viewWillAppear:animated];
[detection startCameraAndGraph];
}
@end
```
- Open Info.plist and add the camera permission:
```
<key>NSCameraUsageDescription</key>
<string>This app needs your permission to access the camera</string>
<!-- In the Xcode plist editor this appears as:
     Privacy - Camera Usage Description (String):
     This app needs your permission to access the camera -->
```
- Click the "TestHelloWorldAppLibrary" project in the sidebar and select the target "TestHelloWorldAppLibrary". Open "Signing & Capabilities", check "Automatically manage signing" (confirm the popup), and set Team to your personal account. In "General", set the Deployment Info to iOS 10.0. In "Build Settings", select the "All" filter, search for "ENABLE_BITCODE", and set Enable Bitcode to "No".
- Building in Xcode fails with `Undefined symbols for architecture arm64`; renaming ViewController.m to ViewController.mm reduces the errors but does not eliminate them.
- Some research turned up the difference between .a and .framework (https://blog.csdn.net/kokmmm33/article/details/50145883): a .a is a plain binary archive, while a .framework bundles the binary together with resource files. A .a cannot be used on its own; it needs at least its .h files, whereas a .framework can be used directly: .a + .h + resource files = .framework. Bazel's Apple build rules provide ios_framework; see https://github.com/noppefoxwolf/HandTracker.git for a reference.
- Update the BUILD file:
```
MIN_IOS_VERSION = "10.0"
load(
"@build_bazel_rules_apple//apple:ios.bzl",
"ios_framework",
)
ios_framework(
name = "HelloWorldApp",
hdrs = [
"MediapipeEdgeDetection.h",
],
bundle_id = "com.google.mediapipe.HelloWorld",
families = [
"iphone",
"ipad",
],
infoplists = ["Info.plist"],
minimum_os_version = MIN_IOS_VERSION,
deps = [":HelloWorldAppLibrary"],
)
objc_library(
name = "HelloWorldAppLibrary",
srcs = [
"MediapipeEdgeDetection.mm",
],
hdrs = [
"MediapipeEdgeDetection.h",
],
data = [
"//mediapipe/graphs/edge_detection:mobile_gpu_binary_graph",
],
sdk_frameworks = [
"UIKit",
"AVFoundation",
"CoreGraphics",
"CoreMedia",
],
deps = [
"//mediapipe/objc:mediapipe_framework_ios",
"//mediapipe/objc:mediapipe_input_sources_ios",
"//mediapipe/objc:mediapipe_layer_renderer",
"//mediapipe/graphs/edge_detection:mobile_calculators",
],
)
```
- Build the framework:
```
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/helloworld:HelloWorldApp
# Copy it out for later use
sudo cp bazel-bin/mediapipe/examples/ios/helloworld/HelloWorldApp.zip mediapipe/examples/ios/helloworld
```
- Copy HelloWorldApp.zip from mediapipe/examples/ios/helloworld into the TestHelloWorldAppLibrary project directory and unzip it to get HelloWorldApp.framework, as sketched below. Open the TestHelloWorldAppLibrary project in Xcode, add HelloWorldApp.framework, and delete libHelloWorldAppLibrary.a and MediapipeEdgeDetection.h.
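A sketch of the copy and unzip, with a hypothetical project path (substitute your own):
```
# PROJECT_DIR is a hypothetical location; adjust to your actual project directory.
PROJECT_DIR=~/TestHelloWorldAppLibrary/TestHelloWorldAppLibrary
cp mediapipe/examples/ios/helloworld/HelloWorldApp.zip "$PROJECT_DIR"
cd "$PROJECT_DIR" && unzip HelloWorldApp.zip   # produces HelloWorldApp.framework
```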
- Open ViewController.m and change the import to the framework header:
```
#import "MediapipeEdgeDetection.h"
改成
#import
```
- Running failed with `dyld: Library not loaded: @rpath/HelloWorldApp.framework/HelloWorldApp`. Select the target "TestHelloWorldAppLibrary", open "Build Phases", click the plus sign at the top left (above Dependencies), choose New Copy Files Phase, expand Copy Files, set Destination to Frameworks, and click the plus sign to add the framework package under Name.
- After building and installing once more, the app runs successfully: the camera permission prompt appears, and after confirming it the edge-detected camera image is displayed.