I ran a quick test of the Pow and Eltwise operators. The Caffe model converts to ms successfully, but both inference and quantization fail.
Test network structure:
name: "test_power"
layer {
name: "frame_1"
type: "Input"
top: "frame_1"
input_param {
shape {
dim: 1
dim: 3
dim: 272
dim: 480
}
}
}
layer {
name: "frame_2"
type: "Input"
top: "frame_2"
input_param {
shape {
dim: 1
dim: 3
dim: 272
dim: 480
}
}
}
layer {
name: "concatenate_1"
type: "Concat"
bottom: "frame_1"
bottom: "frame_2"
top: "concatenate_1"
concat_param {
axis: 1
}
}
layer {
name: "head_output"
type: "Convolution"
bottom: "concatenate_1"
top: "head_output"
convolution_param {
num_output: 2
bias_term: true
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
weight_filler: {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "fg_output"
type: "Convolution"
bottom: "concatenate_1"
top: "fg_output"
convolution_param {
num_output: 2
bias_term: true
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
weight_filler: {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "flow_output"
type: "Convolution"
bottom: "concatenate_1"
top: "flow_output"
convolution_param {
num_output: 2
bias_term: true
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
weight_filler: {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "head_output_reverse_layer"
bottom: "head_output"
top: "head_output_reverse"
type: "Power"
power_param {
power: 1
scale: -1
shift: 0
}
}
layer {
name: "layer_threshold_head_output"
bottom: "head_output_reverse"
top: "argmax_thr_head_output"
type: "ArgMax"
argmax_param {
axis: 1
}
}
layer {
name: "fg_output_reverse_layer"
bottom: "fg_output"
top: "fg_output_reverse"
type: "Power"
power_param {
power: 1
scale: -1
shift: 0
}
}
layer {
name: "layer_threshold_fg_output"
bottom: "fg_output_reverse"
top: "argmax_thr_fg_output"
type: "ArgMax"
argmax_param {
axis: 1
}
}
layer {
name: "flow_output_reverse_layer"
bottom: "flow_output"
top: "flow_output_reverse"
type: "Power"
power_param {
power: 1
scale: -1
shift: 0
}
}
layer {
name: "layer_threshold_flow_output"
bottom: "flow_output_reverse"
top: "argmax_thr_flow_output"
type: "ArgMax"
argmax_param {
axis: 1
}
}
layer {
name: "argmax_thr_flow_output_power_layer"
bottom: "argmax_thr_flow_output"
top: "argmax_thr_flow_output_p"
type: "Power"
power_param {
power: 1
scale: 4
shift: 0
}
}
layer {
name: "argmax_thr_head_output_power_layer"
bottom: "argmax_thr_head_output"
top: "argmax_thr_head_output_p"
type: "Power"
power_param {
power: 1
scale: 2
shift: 0
}
}
layer {
name: "res1"
type: "Eltwise"
bottom: "argmax_thr_flow_output_p"
bottom: "argmax_thr_head_output_p"
top: "res1"
eltwise_param {
operation: SUM
coeff: 1
coeff: 1
}
}
layer {
name: "final_res"
type: "Eltwise"
bottom: "res1"
bottom: "argmax_thr_fg_output"
top: "final_res"
eltwise_param {
operation: SUM
coeff: 1
coeff: 1
}
}
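For reference, the post-processing chain above (Power → ArgMax → coefficient-weighted Eltwise SUM) can be sketched in NumPy. This is a rough sketch of the intended layer semantics only, with toy random tensors standing in for the real convolution outputs; it is not the converter's or runtime's implementation:

```python
import numpy as np

def power_layer(x, power=1.0, scale=1.0, shift=0.0):
    # Caffe Power layer: y = (shift + scale * x) ** power
    return (shift + scale * x) ** power

def argmax_layer(x, axis=1):
    # Caffe ArgMax layer (top_k=1): index of the max along `axis`
    return np.argmax(x, axis=axis)

def eltwise_sum(a, b, coeff_a=1.0, coeff_b=1.0):
    # Caffe Eltwise SUM with one coefficient per bottom
    return coeff_a * a + coeff_b * b

# Toy (N, C, H, W) tensors in place of the three 2-channel conv outputs
rng = np.random.default_rng(0)
head = rng.standard_normal((1, 2, 4, 4))
fg = rng.standard_normal((1, 2, 4, 4))
flow = rng.standard_normal((1, 2, 4, 4))

# Negate (Power with scale=-1), then take the channel argmax
argmax_head = argmax_layer(power_layer(head, scale=-1))
argmax_fg = argmax_layer(power_layer(fg, scale=-1))
argmax_flow = argmax_layer(power_layer(flow, scale=-1))

# Scale the argmax maps (Power with scale=4 / scale=2), then sum
res1 = eltwise_sum(power_layer(argmax_flow, scale=4),
                   power_layer(argmax_head, scale=2))
final_res = eltwise_sum(res1, argmax_fg)
```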
Additionally, the Eltwise operator only supports the case of exactly two inputs with both coefficients equal to 1.
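For context, Caffe defines Eltwise SUM for an arbitrary number of same-shaped bottoms with one optional coefficient per bottom; the limitation above is narrower than that. A minimal NumPy sketch of the full Caffe semantics (a reference sketch, not the converter's code):

```python
import numpy as np

def eltwise_sum(inputs, coeffs=None):
    # Full Caffe Eltwise SUM semantics: N same-shaped inputs,
    # one coefficient per input (all default to 1.0).
    if coeffs is None:
        coeffs = [1.0] * len(inputs)
    assert len(coeffs) == len(inputs)
    out = np.zeros_like(inputs[0], dtype=float)
    for c, x in zip(coeffs, inputs):
        out += c * x
    return out

a = np.ones((1, 2, 2, 2))
b = np.full((1, 2, 2, 2), 2.0)
c = np.full((1, 2, 2, 2), 3.0)
# Three inputs with coefficients 1, -0.5, 2: 1*1 - 0.5*2 + 2*3 = 6
res = eltwise_sum([a, b, c], coeffs=[1.0, -0.5, 2.0])
```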
Please assign a maintainer to check this issue.
@fangwenyi @chengxiaoli
Please add component (comp) or special interest group (sig) labels so the PR can be routed to the right reviewer; see https://gitee.com/mindspore/community/blob/master/sigs/dx/docs/labels.md for the full list.
For example, for a change to the data component, comment:
//comp/data
To invite the data SIG to review, comment:
//sig/data
You can also mark the type of the PR, e.g. a bugfix or a feature request:
//kind/bug or //kind/feature
Hello, we have received the issue and assigned someone to handle it. Please watch for updates on Gitee.
Network that triggers the Eltwise error (same as above, but without the Power scaling layers before res1, so Eltwise consumes the ArgMax outputs directly):
name: "test_power"
layer {
name: "frame_1"
type: "Input"
top: "frame_1"
input_param {
shape {
dim: 1
dim: 3
dim: 272
dim: 480
}
}
}
layer {
name: "frame_2"
type: "Input"
top: "frame_2"
input_param {
shape {
dim: 1
dim: 3
dim: 272
dim: 480
}
}
}
layer {
name: "concatenate_1"
type: "Concat"
bottom: "frame_1"
bottom: "frame_2"
top: "concatenate_1"
concat_param {
axis: 1
}
}
layer {
name: "head_output"
type: "Convolution"
bottom: "concatenate_1"
top: "head_output"
convolution_param {
num_output: 2
bias_term: true
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
weight_filler: {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "fg_output"
type: "Convolution"
bottom: "concatenate_1"
top: "fg_output"
convolution_param {
num_output: 2
bias_term: true
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
weight_filler: {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "flow_output"
type: "Convolution"
bottom: "concatenate_1"
top: "flow_output"
convolution_param {
num_output: 2
bias_term: true
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
weight_filler: {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "head_output_reverse_layer"
bottom: "head_output"
top: "head_output_reverse"
type: "Power"
power_param {
power: 1
scale: -1
shift: 0
}
}
layer {
name: "layer_threshold_head_output"
bottom: "head_output_reverse"
top: "argmax_thr_head_output"
type: "ArgMax"
argmax_param {
axis: 1
}
}
layer {
name: "fg_output_reverse_layer"
bottom: "fg_output"
top: "fg_output_reverse"
type: "Power"
power_param {
power: 1
scale: -1
shift: 0
}
}
layer {
name: "layer_threshold_fg_output"
bottom: "fg_output_reverse"
top: "argmax_thr_fg_output"
type: "ArgMax"
argmax_param {
axis: 1
}
}
layer {
name: "flow_output_reverse_layer"
bottom: "flow_output"
top: "flow_output_reverse"
type: "Power"
power_param {
power: 1
scale: -1
shift: 0
}
}
layer {
name: "layer_threshold_flow_output"
bottom: "flow_output_reverse"
top: "argmax_thr_flow_output"
type: "ArgMax"
argmax_param {
axis: 1
}
}
layer {
name: "res1"
type: "Eltwise"
bottom: "argmax_thr_flow_output"
bottom: "argmax_thr_head_output"
top: "res1"
eltwise_param {
operation: SUM
coeff: 1
coeff: 1
}
}
layer {
name: "final_res"
type: "Eltwise"
bottom: "res1"
bottom: "argmax_thr_fg_output"
top: "final_res"
eltwise_param {
operation: SUM
coeff: 1
coeff: 1
}
}
It seems I can't add attachments here. How can I send it to you?
Please post your email address above.
Hello, has the problem been solved? If not, you can reach out to zhaodezan for support.
Since there has been no feedback for a long time, this issue is being closed for now. If the problem persists, please provide the specific details and set the issue state back to WIP, and we will follow up. Thanks.