AB739 committed on
Commit 14660ec · verified · 1 parent: 06c56bf

Upload 2 files

Files changed (2)
  1. openvino_model/model.bin +3 -0
  2. openvino_model/model.xml +2542 -0
openvino_model/model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1ea08d6ac3c5592ad9afb0bd9ff4f31f505a302003f14e9381db4fb6f68907f5
+ size 142068
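The three lines above are a Git LFS pointer, not the binary itself: the real `model.bin` lives in LFS storage and is identified by its SHA-256 digest and byte size. A downloaded copy can be checked against the pointer with a short sketch (the helper names below are hypothetical; the pointer format follows the LFS spec v1 referenced in the first line):

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into a {key: value} dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

def verify_against_pointer(blob: bytes, pointer_text: str) -> bool:
    """Check a local blob against the oid and size recorded in a pointer."""
    fields = parse_lfs_pointer(pointer_text)
    algo, _, expected_digest = fields["oid"].partition(":")
    if algo != "sha256":          # LFS spec v1 uses sha256 oids
        return False
    return (hashlib.sha256(blob).hexdigest() == expected_digest
            and len(blob) == int(fields["size"]))
```

Running this over the downloaded `model.bin` and the pointer text above should report a match when the 142,068-byte blob is intact.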
openvino_model/model.xml ADDED
@@ -0,0 +1,2542 @@
+ <?xml version="1.0"?>
+ <net name="main_graph" version="11">
+ <layers>
+ <layer id="0" name="input" type="Parameter" version="opset1">
+ <data shape="?,1,64,481" element_type="f32" />
+ <rt_info>
+ <attribute name="old_api_map_element_type" version="0" value="f16" />
+ </rt_info>
+ <output>
+ <port id="0" precision="FP32" names="input">
+ <dim>-1</dim>
+ <dim>1</dim>
+ <dim>64</dim>
+ <dim>481</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="onnx::Conv_135_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="24, 1, 5, 5" offset="0" size="1200" />
+ <output>
+ <port id="0" precision="FP16" names="onnx::Conv_135">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="2" name="onnx::Conv_135" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="/backbone/conv1/0/Conv/WithoutBiases" type="Convolution" version="opset1">
+ <data strides="2, 2" dilations="1, 1" pads_begin="2, 2" pads_end="2, 2" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1</dim>
+ <dim>64</dim>
+ <dim>481</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="Reshape_11328_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="1, 24, 1, 1" offset="1200" size="48" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="5" name="Reshape_11328" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="6" name="/backbone/conv1/0/Conv" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/conv1/0/Conv_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="7" name="/backbone/conv1/2/Relu" type="ReLU" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="/backbone/conv1/2/Relu_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="8" name="Reshape_11341_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="24, 1, 1, 5, 5" offset="1248" size="1200" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="9" name="Reshape_11341" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="10" name="/backbone/single_blocks.0/0/Conv" type="GroupConvolution" version="opset1">
+ <data strides="1, 1" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.0/0/Conv_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="11" name="onnx::Conv_138_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="24, 24, 1, 1" offset="2448" size="1152" />
+ <output>
+ <port id="0" precision="FP16" names="onnx::Conv_138">
+ <dim>24</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="12" name="onnx::Conv_138" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="13" name="/backbone/single_blocks.0/1/Conv/WithoutBiases" type="Convolution" version="opset1">
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="14" name="Reshape_11394_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="1, 24, 1, 1" offset="3600" size="48" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="15" name="Reshape_11394" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="16" name="/backbone/single_blocks.0/1/Conv" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.0/1/Conv_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="17" name="/backbone/single_blocks.0/3/Relu" type="ReLU" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="/backbone/single_blocks.0/3/Relu_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="18" name="Reshape_11407_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="24, 1, 1, 5, 5" offset="3648" size="1200" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="19" name="Reshape_11407" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="20" name="/backbone/single_blocks.1/Conv" type="GroupConvolution" version="opset1">
+ <data strides="1, 1" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.1/Conv_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="21" name="onnx::Conv_141_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="24, 24, 1, 1" offset="4848" size="1152" />
+ <output>
+ <port id="0" precision="FP16" names="onnx::Conv_141">
+ <dim>24</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="22" name="onnx::Conv_141" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="23" name="/backbone/single_blocks.1/Conv_1/WithoutBiases" type="Convolution" version="opset1">
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="24" name="Reshape_11460_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="1, 24, 1, 1" offset="6000" size="48" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="25" name="Reshape_11460" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="26" name="/backbone/single_blocks.1/Conv_1" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.1/Conv_1_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="27" name="/backbone/single_blocks.1/Relu" type="ReLU" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="/backbone/single_blocks.1/Relu_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="28" name="Reshape_11473_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="24, 1, 1, 5, 5" offset="6048" size="1200" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="29" name="Reshape_11473" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="30" name="/backbone/single_blocks.2/0/Conv" type="GroupConvolution" version="opset1">
+ <data strides="2, 2" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>32</dim>
+ <dim>241</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.2/0/Conv_output_0">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="31" name="onnx::Conv_144_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="48, 24, 1, 1" offset="7248" size="2304" />
+ <output>
+ <port id="0" precision="FP16" names="onnx::Conv_144">
+ <dim>48</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="32" name="onnx::Conv_144" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>48</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="33" name="/backbone/single_blocks.2/1/Conv/WithoutBiases" type="Convolution" version="opset1">
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>24</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>24</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="34" name="Reshape_11526_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="1, 48, 1, 1" offset="9552" size="96" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="35" name="Reshape_11526" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="36" name="/backbone/single_blocks.2/1/Conv" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.2/1/Conv_output_0">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="37" name="/backbone/single_blocks.2/3/Relu" type="ReLU" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="/backbone/single_blocks.2/3/Relu_output_0">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="38" name="Reshape_11539_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="48, 1, 1, 5, 5" offset="9648" size="2400" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="39" name="Reshape_11539" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="40" name="/backbone/single_blocks.3/0/Conv" type="GroupConvolution" version="opset1">
+ <data strides="1, 1" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.3/0/Conv_output_0">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="41" name="onnx::Conv_147_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="48, 48, 1, 1" offset="12048" size="4608" />
+ <output>
+ <port id="0" precision="FP16" names="onnx::Conv_147">
+ <dim>48</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="42" name="onnx::Conv_147" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>48</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="43" name="/backbone/single_blocks.3/1/Conv/WithoutBiases" type="Convolution" version="opset1">
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="44" name="Reshape_11592_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="1, 48, 1, 1" offset="16656" size="96" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="45" name="Reshape_11592" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="46" name="/backbone/single_blocks.3/1/Conv" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.3/1/Conv_output_0">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="47" name="/backbone/single_blocks.3/3/Relu" type="ReLU" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="/backbone/single_blocks.3/3/Relu_output_0">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="48" name="Reshape_11605_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="48, 1, 1, 5, 5" offset="16752" size="2400" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="49" name="Reshape_11605" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="50" name="/backbone/single_blocks.4/Conv" type="GroupConvolution" version="opset1">
+ <data strides="1, 1" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>5</dim>
+ <dim>5</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/single_blocks.4/Conv_output_0">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="51" name="onnx::Conv_150_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="48, 48, 1, 1" offset="19152" size="4608" />
+ <output>
+ <port id="0" precision="FP16" names="onnx::Conv_150">
+ <dim>48</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="52" name="onnx::Conv_150" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>48</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="53" name="/backbone/single_blocks.4/Conv_1/WithoutBiases" type="Convolution" version="opset1">
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>48</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>48</dim>
+ <dim>16</dim>
+ <dim>121</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="54" name="Reshape_11658_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="1, 48, 1, 1" offset="23760" size="96" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>48</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
1066
+ </layer>
1067
+ <layer id="55" name="Reshape_11658" type="Convert" version="opset1">
1068
+ <data destination_type="f32" />
1069
+ <rt_info>
1070
+ <attribute name="decompression" version="0" />
1071
+ </rt_info>
1072
+ <input>
1073
+ <port id="0" precision="FP16">
1074
+ <dim>1</dim>
1075
+ <dim>48</dim>
1076
+ <dim>1</dim>
1077
+ <dim>1</dim>
1078
+ </port>
1079
+ </input>
1080
+ <output>
1081
+ <port id="1" precision="FP32">
1082
+ <dim>1</dim>
1083
+ <dim>48</dim>
1084
+ <dim>1</dim>
1085
+ <dim>1</dim>
1086
+ </port>
1087
+ </output>
1088
+ </layer>
1089
+ <layer id="56" name="/backbone/single_blocks.4/Conv_1" type="Add" version="opset1">
1090
+ <data auto_broadcast="numpy" />
1091
+ <input>
1092
+ <port id="0" precision="FP32">
1093
+ <dim>-1</dim>
1094
+ <dim>48</dim>
1095
+ <dim>16</dim>
1096
+ <dim>121</dim>
1097
+ </port>
1098
+ <port id="1" precision="FP32">
1099
+ <dim>1</dim>
1100
+ <dim>48</dim>
1101
+ <dim>1</dim>
1102
+ <dim>1</dim>
1103
+ </port>
1104
+ </input>
1105
+ <output>
1106
+ <port id="2" precision="FP32" names="/backbone/single_blocks.4/Conv_1_output_0">
1107
+ <dim>-1</dim>
1108
+ <dim>48</dim>
1109
+ <dim>16</dim>
1110
+ <dim>121</dim>
1111
+ </port>
1112
+ </output>
1113
+ </layer>
1114
+ <layer id="57" name="/backbone/single_blocks.4/Relu" type="ReLU" version="opset1">
1115
+ <input>
1116
+ <port id="0" precision="FP32">
1117
+ <dim>-1</dim>
1118
+ <dim>48</dim>
1119
+ <dim>16</dim>
1120
+ <dim>121</dim>
1121
+ </port>
1122
+ </input>
1123
+ <output>
1124
+ <port id="1" precision="FP32" names="/backbone/single_blocks.4/Relu_output_0">
1125
+ <dim>-1</dim>
1126
+ <dim>48</dim>
1127
+ <dim>16</dim>
1128
+ <dim>121</dim>
1129
+ </port>
1130
+ </output>
1131
+ </layer>
1132
+ <layer id="58" name="Reshape_11671_compressed" type="Const" version="opset1">
1133
+ <data element_type="f16" shape="48, 1, 1, 5, 5" offset="23856" size="2400" />
1134
+ <output>
1135
+ <port id="0" precision="FP16">
1136
+ <dim>48</dim>
1137
+ <dim>1</dim>
1138
+ <dim>1</dim>
1139
+ <dim>5</dim>
1140
+ <dim>5</dim>
1141
+ </port>
1142
+ </output>
1143
+ </layer>
1144
+ <layer id="59" name="Reshape_11671" type="Convert" version="opset1">
1145
+ <data destination_type="f32" />
1146
+ <rt_info>
1147
+ <attribute name="decompression" version="0" />
1148
+ </rt_info>
1149
+ <input>
1150
+ <port id="0" precision="FP16">
1151
+ <dim>48</dim>
1152
+ <dim>1</dim>
1153
+ <dim>1</dim>
1154
+ <dim>5</dim>
1155
+ <dim>5</dim>
1156
+ </port>
1157
+ </input>
1158
+ <output>
1159
+ <port id="1" precision="FP32">
1160
+ <dim>48</dim>
1161
+ <dim>1</dim>
1162
+ <dim>1</dim>
1163
+ <dim>5</dim>
1164
+ <dim>5</dim>
1165
+ </port>
1166
+ </output>
1167
+ </layer>
1168
+ <layer id="60" name="/backbone/0/0/Conv" type="GroupConvolution" version="opset1">
1169
+ <data strides="2, 2" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
1170
+ <input>
1171
+ <port id="0" precision="FP32">
1172
+ <dim>-1</dim>
1173
+ <dim>48</dim>
1174
+ <dim>16</dim>
1175
+ <dim>121</dim>
1176
+ </port>
1177
+ <port id="1" precision="FP32">
1178
+ <dim>48</dim>
1179
+ <dim>1</dim>
1180
+ <dim>1</dim>
1181
+ <dim>5</dim>
1182
+ <dim>5</dim>
1183
+ </port>
1184
+ </input>
1185
+ <output>
1186
+ <port id="2" precision="FP32" names="/backbone/0/0/Conv_output_0">
1187
+ <dim>-1</dim>
1188
+ <dim>48</dim>
1189
+ <dim>8</dim>
1190
+ <dim>61</dim>
1191
+ </port>
1192
+ </output>
1193
+ </layer>
1194
+ <layer id="61" name="onnx::Conv_153_compressed" type="Const" version="opset1">
1195
+ <data element_type="f16" shape="96, 48, 1, 1" offset="26256" size="9216" />
1196
+ <output>
1197
+ <port id="0" precision="FP16" names="onnx::Conv_153">
1198
+ <dim>96</dim>
1199
+ <dim>48</dim>
1200
+ <dim>1</dim>
1201
+ <dim>1</dim>
1202
+ </port>
1203
+ </output>
1204
+ </layer>
1205
+ <layer id="62" name="onnx::Conv_153" type="Convert" version="opset1">
1206
+ <data destination_type="f32" />
1207
+ <rt_info>
1208
+ <attribute name="decompression" version="0" />
1209
+ </rt_info>
1210
+ <input>
1211
+ <port id="0" precision="FP16">
1212
+ <dim>96</dim>
1213
+ <dim>48</dim>
1214
+ <dim>1</dim>
1215
+ <dim>1</dim>
1216
+ </port>
1217
+ </input>
1218
+ <output>
1219
+ <port id="1" precision="FP32">
1220
+ <dim>96</dim>
1221
+ <dim>48</dim>
1222
+ <dim>1</dim>
1223
+ <dim>1</dim>
1224
+ </port>
1225
+ </output>
1226
+ </layer>
1227
+ <layer id="63" name="/backbone/0/1/Conv/WithoutBiases" type="Convolution" version="opset1">
1228
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
1229
+ <input>
1230
+ <port id="0" precision="FP32">
1231
+ <dim>-1</dim>
1232
+ <dim>48</dim>
1233
+ <dim>8</dim>
1234
+ <dim>61</dim>
1235
+ </port>
1236
+ <port id="1" precision="FP32">
1237
+ <dim>96</dim>
1238
+ <dim>48</dim>
1239
+ <dim>1</dim>
1240
+ <dim>1</dim>
1241
+ </port>
1242
+ </input>
1243
+ <output>
1244
+ <port id="2" precision="FP32">
1245
+ <dim>-1</dim>
1246
+ <dim>96</dim>
1247
+ <dim>8</dim>
1248
+ <dim>61</dim>
1249
+ </port>
1250
+ </output>
1251
+ </layer>
1252
+ <layer id="64" name="Reshape_11724_compressed" type="Const" version="opset1">
1253
+ <data element_type="f16" shape="1, 96, 1, 1" offset="35472" size="192" />
1254
+ <output>
1255
+ <port id="0" precision="FP16">
1256
+ <dim>1</dim>
1257
+ <dim>96</dim>
1258
+ <dim>1</dim>
1259
+ <dim>1</dim>
1260
+ </port>
1261
+ </output>
1262
+ </layer>
1263
+ <layer id="65" name="Reshape_11724" type="Convert" version="opset1">
1264
+ <data destination_type="f32" />
1265
+ <rt_info>
1266
+ <attribute name="decompression" version="0" />
1267
+ </rt_info>
1268
+ <input>
1269
+ <port id="0" precision="FP16">
1270
+ <dim>1</dim>
1271
+ <dim>96</dim>
1272
+ <dim>1</dim>
1273
+ <dim>1</dim>
1274
+ </port>
1275
+ </input>
1276
+ <output>
1277
+ <port id="1" precision="FP32">
1278
+ <dim>1</dim>
1279
+ <dim>96</dim>
1280
+ <dim>1</dim>
1281
+ <dim>1</dim>
1282
+ </port>
1283
+ </output>
1284
+ </layer>
1285
+ <layer id="66" name="/backbone/0/1/Conv" type="Add" version="opset1">
1286
+ <data auto_broadcast="numpy" />
1287
+ <input>
1288
+ <port id="0" precision="FP32">
1289
+ <dim>-1</dim>
1290
+ <dim>96</dim>
1291
+ <dim>8</dim>
1292
+ <dim>61</dim>
1293
+ </port>
1294
+ <port id="1" precision="FP32">
1295
+ <dim>1</dim>
1296
+ <dim>96</dim>
1297
+ <dim>1</dim>
1298
+ <dim>1</dim>
1299
+ </port>
1300
+ </input>
1301
+ <output>
1302
+ <port id="2" precision="FP32" names="/backbone/0/1/Conv_output_0">
1303
+ <dim>-1</dim>
1304
+ <dim>96</dim>
1305
+ <dim>8</dim>
1306
+ <dim>61</dim>
1307
+ </port>
1308
+ </output>
1309
+ </layer>
1310
+ <layer id="67" name="/backbone/0/3/Relu" type="ReLU" version="opset1">
1311
+ <input>
1312
+ <port id="0" precision="FP32">
1313
+ <dim>-1</dim>
1314
+ <dim>96</dim>
1315
+ <dim>8</dim>
1316
+ <dim>61</dim>
1317
+ </port>
1318
+ </input>
1319
+ <output>
1320
+ <port id="1" precision="FP32" names="/backbone/0/3/Relu_output_0">
1321
+ <dim>-1</dim>
1322
+ <dim>96</dim>
1323
+ <dim>8</dim>
1324
+ <dim>61</dim>
1325
+ </port>
1326
+ </output>
1327
+ </layer>
1328
+ <layer id="68" name="Reshape_11737_compressed" type="Const" version="opset1">
1329
+ <data element_type="f16" shape="96, 1, 1, 5, 5" offset="35664" size="4800" />
1330
+ <output>
1331
+ <port id="0" precision="FP16">
1332
+ <dim>96</dim>
1333
+ <dim>1</dim>
1334
+ <dim>1</dim>
1335
+ <dim>5</dim>
1336
+ <dim>5</dim>
1337
+ </port>
1338
+ </output>
1339
+ </layer>
1340
+ <layer id="69" name="Reshape_11737" type="Convert" version="opset1">
1341
+ <data destination_type="f32" />
1342
+ <rt_info>
1343
+ <attribute name="decompression" version="0" />
1344
+ </rt_info>
1345
+ <input>
1346
+ <port id="0" precision="FP16">
1347
+ <dim>96</dim>
1348
+ <dim>1</dim>
1349
+ <dim>1</dim>
1350
+ <dim>5</dim>
1351
+ <dim>5</dim>
1352
+ </port>
1353
+ </input>
1354
+ <output>
1355
+ <port id="1" precision="FP32">
1356
+ <dim>96</dim>
1357
+ <dim>1</dim>
1358
+ <dim>1</dim>
1359
+ <dim>5</dim>
1360
+ <dim>5</dim>
1361
+ </port>
1362
+ </output>
1363
+ </layer>
1364
+ <layer id="70" name="/backbone/1/0/Conv" type="GroupConvolution" version="opset1">
1365
+ <data strides="1, 1" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
1366
+ <input>
1367
+ <port id="0" precision="FP32">
1368
+ <dim>-1</dim>
1369
+ <dim>96</dim>
1370
+ <dim>8</dim>
1371
+ <dim>61</dim>
1372
+ </port>
1373
+ <port id="1" precision="FP32">
1374
+ <dim>96</dim>
1375
+ <dim>1</dim>
1376
+ <dim>1</dim>
1377
+ <dim>5</dim>
1378
+ <dim>5</dim>
1379
+ </port>
1380
+ </input>
1381
+ <output>
1382
+ <port id="2" precision="FP32" names="/backbone/1/0/Conv_output_0">
1383
+ <dim>-1</dim>
1384
+ <dim>96</dim>
1385
+ <dim>8</dim>
1386
+ <dim>61</dim>
1387
+ </port>
1388
+ </output>
1389
+ </layer>
1390
+ <layer id="71" name="onnx::Conv_156_compressed" type="Const" version="opset1">
1391
+ <data element_type="f16" shape="96, 96, 1, 1" offset="40464" size="18432" />
1392
+ <output>
1393
+ <port id="0" precision="FP16" names="onnx::Conv_156">
1394
+ <dim>96</dim>
1395
+ <dim>96</dim>
1396
+ <dim>1</dim>
1397
+ <dim>1</dim>
1398
+ </port>
1399
+ </output>
1400
+ </layer>
1401
+ <layer id="72" name="onnx::Conv_156" type="Convert" version="opset1">
1402
+ <data destination_type="f32" />
1403
+ <rt_info>
1404
+ <attribute name="decompression" version="0" />
1405
+ </rt_info>
1406
+ <input>
1407
+ <port id="0" precision="FP16">
1408
+ <dim>96</dim>
1409
+ <dim>96</dim>
1410
+ <dim>1</dim>
1411
+ <dim>1</dim>
1412
+ </port>
1413
+ </input>
1414
+ <output>
1415
+ <port id="1" precision="FP32">
1416
+ <dim>96</dim>
1417
+ <dim>96</dim>
1418
+ <dim>1</dim>
1419
+ <dim>1</dim>
1420
+ </port>
1421
+ </output>
1422
+ </layer>
1423
+ <layer id="73" name="/backbone/1/1/Conv/WithoutBiases" type="Convolution" version="opset1">
1424
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
1425
+ <input>
1426
+ <port id="0" precision="FP32">
1427
+ <dim>-1</dim>
1428
+ <dim>96</dim>
1429
+ <dim>8</dim>
1430
+ <dim>61</dim>
1431
+ </port>
1432
+ <port id="1" precision="FP32">
1433
+ <dim>96</dim>
1434
+ <dim>96</dim>
1435
+ <dim>1</dim>
1436
+ <dim>1</dim>
1437
+ </port>
1438
+ </input>
1439
+ <output>
1440
+ <port id="2" precision="FP32">
1441
+ <dim>-1</dim>
1442
+ <dim>96</dim>
1443
+ <dim>8</dim>
1444
+ <dim>61</dim>
1445
+ </port>
1446
+ </output>
1447
+ </layer>
1448
+ <layer id="74" name="Reshape_11790_compressed" type="Const" version="opset1">
1449
+ <data element_type="f16" shape="1, 96, 1, 1" offset="58896" size="192" />
1450
+ <output>
1451
+ <port id="0" precision="FP16">
1452
+ <dim>1</dim>
1453
+ <dim>96</dim>
1454
+ <dim>1</dim>
1455
+ <dim>1</dim>
1456
+ </port>
1457
+ </output>
1458
+ </layer>
1459
+ <layer id="75" name="Reshape_11790" type="Convert" version="opset1">
1460
+ <data destination_type="f32" />
1461
+ <rt_info>
1462
+ <attribute name="decompression" version="0" />
1463
+ </rt_info>
1464
+ <input>
1465
+ <port id="0" precision="FP16">
1466
+ <dim>1</dim>
1467
+ <dim>96</dim>
1468
+ <dim>1</dim>
1469
+ <dim>1</dim>
1470
+ </port>
1471
+ </input>
1472
+ <output>
1473
+ <port id="1" precision="FP32">
1474
+ <dim>1</dim>
1475
+ <dim>96</dim>
1476
+ <dim>1</dim>
1477
+ <dim>1</dim>
1478
+ </port>
1479
+ </output>
1480
+ </layer>
1481
+ <layer id="76" name="/backbone/1/1/Conv" type="Add" version="opset1">
1482
+ <data auto_broadcast="numpy" />
1483
+ <input>
1484
+ <port id="0" precision="FP32">
1485
+ <dim>-1</dim>
1486
+ <dim>96</dim>
1487
+ <dim>8</dim>
1488
+ <dim>61</dim>
1489
+ </port>
1490
+ <port id="1" precision="FP32">
1491
+ <dim>1</dim>
1492
+ <dim>96</dim>
1493
+ <dim>1</dim>
1494
+ <dim>1</dim>
1495
+ </port>
1496
+ </input>
1497
+ <output>
1498
+ <port id="2" precision="FP32" names="/backbone/1/1/Conv_output_0">
1499
+ <dim>-1</dim>
1500
+ <dim>96</dim>
1501
+ <dim>8</dim>
1502
+ <dim>61</dim>
1503
+ </port>
1504
+ </output>
1505
+ </layer>
1506
+ <layer id="77" name="/backbone/1/3/Relu" type="ReLU" version="opset1">
1507
+ <input>
1508
+ <port id="0" precision="FP32">
1509
+ <dim>-1</dim>
1510
+ <dim>96</dim>
1511
+ <dim>8</dim>
1512
+ <dim>61</dim>
1513
+ </port>
1514
+ </input>
1515
+ <output>
1516
+ <port id="1" precision="FP32" names="/backbone/1/3/Relu_output_0">
1517
+ <dim>-1</dim>
1518
+ <dim>96</dim>
1519
+ <dim>8</dim>
1520
+ <dim>61</dim>
1521
+ </port>
1522
+ </output>
1523
+ </layer>
1524
+ <layer id="78" name="Reshape_11803_compressed" type="Const" version="opset1">
1525
+ <data element_type="f16" shape="96, 1, 1, 5, 5" offset="59088" size="4800" />
1526
+ <output>
1527
+ <port id="0" precision="FP16">
1528
+ <dim>96</dim>
1529
+ <dim>1</dim>
1530
+ <dim>1</dim>
1531
+ <dim>5</dim>
1532
+ <dim>5</dim>
1533
+ </port>
1534
+ </output>
1535
+ </layer>
1536
+ <layer id="79" name="Reshape_11803" type="Convert" version="opset1">
1537
+ <data destination_type="f32" />
1538
+ <rt_info>
1539
+ <attribute name="decompression" version="0" />
1540
+ </rt_info>
1541
+ <input>
1542
+ <port id="0" precision="FP16">
1543
+ <dim>96</dim>
1544
+ <dim>1</dim>
1545
+ <dim>1</dim>
1546
+ <dim>5</dim>
1547
+ <dim>5</dim>
1548
+ </port>
1549
+ </input>
1550
+ <output>
1551
+ <port id="1" precision="FP32">
1552
+ <dim>96</dim>
1553
+ <dim>1</dim>
1554
+ <dim>1</dim>
1555
+ <dim>5</dim>
1556
+ <dim>5</dim>
1557
+ </port>
1558
+ </output>
1559
+ </layer>
1560
+ <layer id="80" name="/backbone/2/0/Conv" type="GroupConvolution" version="opset1">
1561
+ <data strides="2, 2" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
1562
+ <input>
1563
+ <port id="0" precision="FP32">
1564
+ <dim>-1</dim>
1565
+ <dim>96</dim>
1566
+ <dim>8</dim>
1567
+ <dim>61</dim>
1568
+ </port>
1569
+ <port id="1" precision="FP32">
1570
+ <dim>96</dim>
1571
+ <dim>1</dim>
1572
+ <dim>1</dim>
1573
+ <dim>5</dim>
1574
+ <dim>5</dim>
1575
+ </port>
1576
+ </input>
1577
+ <output>
1578
+ <port id="2" precision="FP32" names="/backbone/2/0/Conv_output_0">
1579
+ <dim>-1</dim>
1580
+ <dim>96</dim>
1581
+ <dim>4</dim>
1582
+ <dim>31</dim>
1583
+ </port>
1584
+ </output>
1585
+ </layer>
1586
+ <layer id="81" name="onnx::Conv_159_compressed" type="Const" version="opset1">
1587
+ <data element_type="f16" shape="96, 96, 1, 1" offset="63888" size="18432" />
1588
+ <output>
1589
+ <port id="0" precision="FP16" names="onnx::Conv_159">
1590
+ <dim>96</dim>
1591
+ <dim>96</dim>
1592
+ <dim>1</dim>
1593
+ <dim>1</dim>
1594
+ </port>
1595
+ </output>
1596
+ </layer>
1597
+ <layer id="82" name="onnx::Conv_159" type="Convert" version="opset1">
1598
+ <data destination_type="f32" />
1599
+ <rt_info>
1600
+ <attribute name="decompression" version="0" />
1601
+ </rt_info>
1602
+ <input>
1603
+ <port id="0" precision="FP16">
1604
+ <dim>96</dim>
1605
+ <dim>96</dim>
1606
+ <dim>1</dim>
1607
+ <dim>1</dim>
1608
+ </port>
1609
+ </input>
1610
+ <output>
1611
+ <port id="1" precision="FP32">
1612
+ <dim>96</dim>
1613
+ <dim>96</dim>
1614
+ <dim>1</dim>
1615
+ <dim>1</dim>
1616
+ </port>
1617
+ </output>
1618
+ </layer>
1619
+ <layer id="83" name="/backbone/2/1/Conv/WithoutBiases" type="Convolution" version="opset1">
1620
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
1621
+ <input>
1622
+ <port id="0" precision="FP32">
1623
+ <dim>-1</dim>
1624
+ <dim>96</dim>
1625
+ <dim>4</dim>
1626
+ <dim>31</dim>
1627
+ </port>
1628
+ <port id="1" precision="FP32">
1629
+ <dim>96</dim>
1630
+ <dim>96</dim>
1631
+ <dim>1</dim>
1632
+ <dim>1</dim>
1633
+ </port>
1634
+ </input>
1635
+ <output>
1636
+ <port id="2" precision="FP32">
1637
+ <dim>-1</dim>
1638
+ <dim>96</dim>
1639
+ <dim>4</dim>
1640
+ <dim>31</dim>
1641
+ </port>
1642
+ </output>
1643
+ </layer>
1644
+ <layer id="84" name="Reshape_11856_compressed" type="Const" version="opset1">
1645
+ <data element_type="f16" shape="1, 96, 1, 1" offset="82320" size="192" />
1646
+ <output>
1647
+ <port id="0" precision="FP16">
1648
+ <dim>1</dim>
1649
+ <dim>96</dim>
1650
+ <dim>1</dim>
1651
+ <dim>1</dim>
1652
+ </port>
1653
+ </output>
1654
+ </layer>
1655
+ <layer id="85" name="Reshape_11856" type="Convert" version="opset1">
1656
+ <data destination_type="f32" />
1657
+ <rt_info>
1658
+ <attribute name="decompression" version="0" />
1659
+ </rt_info>
1660
+ <input>
1661
+ <port id="0" precision="FP16">
1662
+ <dim>1</dim>
1663
+ <dim>96</dim>
1664
+ <dim>1</dim>
1665
+ <dim>1</dim>
1666
+ </port>
1667
+ </input>
1668
+ <output>
1669
+ <port id="1" precision="FP32">
1670
+ <dim>1</dim>
1671
+ <dim>96</dim>
1672
+ <dim>1</dim>
1673
+ <dim>1</dim>
1674
+ </port>
1675
+ </output>
1676
+ </layer>
1677
+ <layer id="86" name="/backbone/2/1/Conv" type="Add" version="opset1">
1678
+ <data auto_broadcast="numpy" />
1679
+ <input>
1680
+ <port id="0" precision="FP32">
1681
+ <dim>-1</dim>
1682
+ <dim>96</dim>
1683
+ <dim>4</dim>
1684
+ <dim>31</dim>
1685
+ </port>
1686
+ <port id="1" precision="FP32">
1687
+ <dim>1</dim>
1688
+ <dim>96</dim>
1689
+ <dim>1</dim>
1690
+ <dim>1</dim>
1691
+ </port>
1692
+ </input>
1693
+ <output>
1694
+ <port id="2" precision="FP32" names="/backbone/2/1/Conv_output_0">
1695
+ <dim>-1</dim>
1696
+ <dim>96</dim>
1697
+ <dim>4</dim>
1698
+ <dim>31</dim>
1699
+ </port>
1700
+ </output>
1701
+ </layer>
1702
+ <layer id="87" name="/backbone/2/3/Relu" type="ReLU" version="opset1">
1703
+ <input>
1704
+ <port id="0" precision="FP32">
1705
+ <dim>-1</dim>
1706
+ <dim>96</dim>
1707
+ <dim>4</dim>
1708
+ <dim>31</dim>
1709
+ </port>
1710
+ </input>
1711
+ <output>
1712
+ <port id="1" precision="FP32" names="/backbone/2/3/Relu_output_0">
1713
+ <dim>-1</dim>
1714
+ <dim>96</dim>
1715
+ <dim>4</dim>
1716
+ <dim>31</dim>
1717
+ </port>
1718
+ </output>
1719
+ </layer>
1720
+ <layer id="88" name="Reshape_11869_compressed" type="Const" version="opset1">
1721
+ <data element_type="f16" shape="96, 1, 1, 5, 5" offset="82512" size="4800" />
1722
+ <output>
1723
+ <port id="0" precision="FP16">
1724
+ <dim>96</dim>
1725
+ <dim>1</dim>
1726
+ <dim>1</dim>
1727
+ <dim>5</dim>
1728
+ <dim>5</dim>
1729
+ </port>
1730
+ </output>
1731
+ </layer>
1732
+ <layer id="89" name="Reshape_11869" type="Convert" version="opset1">
1733
+ <data destination_type="f32" />
1734
+ <rt_info>
1735
+ <attribute name="decompression" version="0" />
1736
+ </rt_info>
1737
+ <input>
1738
+ <port id="0" precision="FP16">
1739
+ <dim>96</dim>
1740
+ <dim>1</dim>
1741
+ <dim>1</dim>
1742
+ <dim>5</dim>
1743
+ <dim>5</dim>
1744
+ </port>
1745
+ </input>
1746
+ <output>
1747
+ <port id="1" precision="FP32">
1748
+ <dim>96</dim>
1749
+ <dim>1</dim>
1750
+ <dim>1</dim>
1751
+ <dim>5</dim>
1752
+ <dim>5</dim>
1753
+ </port>
1754
+ </output>
1755
+ </layer>
1756
+ <layer id="90" name="/backbone/3/Conv" type="GroupConvolution" version="opset1">
1757
+ <data strides="1, 1" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
1758
+ <input>
1759
+ <port id="0" precision="FP32">
1760
+ <dim>-1</dim>
1761
+ <dim>96</dim>
1762
+ <dim>4</dim>
1763
+ <dim>31</dim>
1764
+ </port>
1765
+ <port id="1" precision="FP32">
1766
+ <dim>96</dim>
1767
+ <dim>1</dim>
1768
+ <dim>1</dim>
1769
+ <dim>5</dim>
1770
+ <dim>5</dim>
1771
+ </port>
1772
+ </input>
1773
+ <output>
1774
+ <port id="2" precision="FP32" names="/backbone/3/Conv_output_0">
1775
+ <dim>-1</dim>
1776
+ <dim>96</dim>
1777
+ <dim>4</dim>
1778
+ <dim>31</dim>
1779
+ </port>
1780
+ </output>
1781
+ </layer>
1782
+ <layer id="91" name="onnx::Conv_162_compressed" type="Const" version="opset1">
1783
+ <data element_type="f16" shape="96, 96, 1, 1" offset="87312" size="18432" />
1784
+ <output>
1785
+ <port id="0" precision="FP16" names="onnx::Conv_162">
1786
+ <dim>96</dim>
1787
+ <dim>96</dim>
1788
+ <dim>1</dim>
1789
+ <dim>1</dim>
1790
+ </port>
1791
+ </output>
1792
+ </layer>
1793
+ <layer id="92" name="onnx::Conv_162" type="Convert" version="opset1">
1794
+ <data destination_type="f32" />
1795
+ <rt_info>
1796
+ <attribute name="decompression" version="0" />
1797
+ </rt_info>
1798
+ <input>
1799
+ <port id="0" precision="FP16">
1800
+ <dim>96</dim>
1801
+ <dim>96</dim>
1802
+ <dim>1</dim>
1803
+ <dim>1</dim>
1804
+ </port>
1805
+ </input>
1806
+ <output>
1807
+ <port id="1" precision="FP32">
1808
+ <dim>96</dim>
1809
+ <dim>96</dim>
1810
+ <dim>1</dim>
1811
+ <dim>1</dim>
1812
+ </port>
1813
+ </output>
1814
+ </layer>
1815
+ <layer id="93" name="/backbone/3/Conv_1/WithoutBiases" type="Convolution" version="opset1">
1816
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
1817
+ <input>
1818
+ <port id="0" precision="FP32">
1819
+ <dim>-1</dim>
1820
+ <dim>96</dim>
1821
+ <dim>4</dim>
1822
+ <dim>31</dim>
1823
+ </port>
1824
+ <port id="1" precision="FP32">
1825
+ <dim>96</dim>
1826
+ <dim>96</dim>
1827
+ <dim>1</dim>
1828
+ <dim>1</dim>
1829
+ </port>
1830
+ </input>
1831
+ <output>
1832
+ <port id="2" precision="FP32">
1833
+ <dim>-1</dim>
1834
+ <dim>96</dim>
1835
+ <dim>4</dim>
1836
+ <dim>31</dim>
1837
+ </port>
1838
+ </output>
1839
+ </layer>
1840
+ <layer id="94" name="Reshape_11922_compressed" type="Const" version="opset1">
1841
+ <data element_type="f16" shape="1, 96, 1, 1" offset="105744" size="192" />
1842
+ <output>
1843
+ <port id="0" precision="FP16">
1844
+ <dim>1</dim>
1845
+ <dim>96</dim>
1846
+ <dim>1</dim>
1847
+ <dim>1</dim>
1848
+ </port>
1849
+ </output>
1850
+ </layer>
1851
+ <layer id="95" name="Reshape_11922" type="Convert" version="opset1">
1852
+ <data destination_type="f32" />
1853
+ <rt_info>
1854
+ <attribute name="decompression" version="0" />
1855
+ </rt_info>
1856
+ <input>
1857
+ <port id="0" precision="FP16">
1858
+ <dim>1</dim>
1859
+ <dim>96</dim>
1860
+ <dim>1</dim>
1861
+ <dim>1</dim>
1862
+ </port>
1863
+ </input>
1864
+ <output>
1865
+ <port id="1" precision="FP32">
1866
+ <dim>1</dim>
1867
+ <dim>96</dim>
1868
+ <dim>1</dim>
1869
+ <dim>1</dim>
1870
+ </port>
1871
+ </output>
1872
+ </layer>
1873
+ <layer id="96" name="/backbone/3/Conv_1" type="Add" version="opset1">
1874
+ <data auto_broadcast="numpy" />
1875
+ <input>
1876
+ <port id="0" precision="FP32">
1877
+ <dim>-1</dim>
1878
+ <dim>96</dim>
1879
+ <dim>4</dim>
1880
+ <dim>31</dim>
1881
+ </port>
1882
+ <port id="1" precision="FP32">
1883
+ <dim>1</dim>
1884
+ <dim>96</dim>
1885
+ <dim>1</dim>
1886
+ <dim>1</dim>
1887
+ </port>
1888
+ </input>
1889
+ <output>
1890
+ <port id="2" precision="FP32" names="/backbone/3/Conv_1_output_0">
1891
+ <dim>-1</dim>
1892
+ <dim>96</dim>
1893
+ <dim>4</dim>
1894
+ <dim>31</dim>
1895
+ </port>
1896
+ </output>
1897
+ </layer>
1898
+ <layer id="97" name="/backbone/3/Relu" type="ReLU" version="opset1">
1899
+ <input>
1900
+ <port id="0" precision="FP32">
1901
+ <dim>-1</dim>
1902
+ <dim>96</dim>
1903
+ <dim>4</dim>
1904
+ <dim>31</dim>
1905
+ </port>
1906
+ </input>
1907
+ <output>
1908
+ <port id="1" precision="FP32" names="/backbone/3/Relu_output_0">
1909
+ <dim>-1</dim>
1910
+ <dim>96</dim>
1911
+ <dim>4</dim>
1912
+ <dim>31</dim>
1913
+ </port>
1914
+ </output>
1915
+ </layer>
1916
+ <layer id="98" name="Reshape_11935_compressed" type="Const" version="opset1">
1917
+ <data element_type="f16" shape="96, 1, 1, 5, 5" offset="105936" size="4800" />
1918
+ <output>
1919
+ <port id="0" precision="FP16">
1920
+ <dim>96</dim>
1921
+ <dim>1</dim>
1922
+ <dim>1</dim>
1923
+ <dim>5</dim>
1924
+ <dim>5</dim>
1925
+ </port>
1926
+ </output>
1927
+ </layer>
1928
+ <layer id="99" name="Reshape_11935" type="Convert" version="opset1">
1929
+ <data destination_type="f32" />
1930
+ <rt_info>
1931
+ <attribute name="decompression" version="0" />
1932
+ </rt_info>
1933
+ <input>
1934
+ <port id="0" precision="FP16">
1935
+ <dim>96</dim>
1936
+ <dim>1</dim>
1937
+ <dim>1</dim>
1938
+ <dim>5</dim>
1939
+ <dim>5</dim>
1940
+ </port>
1941
+ </input>
1942
+ <output>
1943
+ <port id="1" precision="FP32">
1944
+ <dim>96</dim>
1945
+ <dim>1</dim>
1946
+ <dim>1</dim>
1947
+ <dim>5</dim>
1948
+ <dim>5</dim>
1949
+ </port>
1950
+ </output>
1951
+ </layer>
1952
+ <layer id="100" name="/backbone/4/Conv" type="GroupConvolution" version="opset1">
1953
+ <data strides="2, 2" pads_begin="2, 2" pads_end="2, 2" dilations="1, 1" auto_pad="explicit" />
1954
+ <input>
1955
+ <port id="0" precision="FP32">
1956
+ <dim>-1</dim>
1957
+ <dim>96</dim>
1958
+ <dim>4</dim>
1959
+ <dim>31</dim>
1960
+ </port>
1961
+ <port id="1" precision="FP32">
1962
+ <dim>96</dim>
1963
+ <dim>1</dim>
1964
+ <dim>1</dim>
1965
+ <dim>5</dim>
1966
+ <dim>5</dim>
1967
+ </port>
1968
+ </input>
1969
+ <output>
1970
+ <port id="2" precision="FP32" names="/backbone/4/Conv_output_0">
1971
+ <dim>-1</dim>
1972
+ <dim>96</dim>
1973
+ <dim>2</dim>
1974
+ <dim>16</dim>
1975
+ </port>
1976
+ </output>
1977
+ </layer>
1978
+ <layer id="101" name="onnx::Conv_165_compressed" type="Const" version="opset1">
1979
+ <data element_type="f16" shape="96, 96, 1, 1" offset="110736" size="18432" />
1980
+ <output>
1981
+ <port id="0" precision="FP16" names="onnx::Conv_165">
1982
+ <dim>96</dim>
1983
+ <dim>96</dim>
1984
+ <dim>1</dim>
1985
+ <dim>1</dim>
1986
+ </port>
1987
+ </output>
1988
+ </layer>
1989
+ <layer id="102" name="onnx::Conv_165" type="Convert" version="opset1">
1990
+ <data destination_type="f32" />
1991
+ <rt_info>
1992
+ <attribute name="decompression" version="0" />
1993
+ </rt_info>
1994
+ <input>
1995
+ <port id="0" precision="FP16">
1996
+ <dim>96</dim>
1997
+ <dim>96</dim>
1998
+ <dim>1</dim>
1999
+ <dim>1</dim>
2000
+ </port>
2001
+ </input>
2002
+ <output>
2003
+ <port id="1" precision="FP32">
2004
+ <dim>96</dim>
2005
+ <dim>96</dim>
2006
+ <dim>1</dim>
2007
+ <dim>1</dim>
2008
+ </port>
2009
+ </output>
2010
+ </layer>
2011
+ <layer id="103" name="/backbone/4/Conv_1/WithoutBiases" type="Convolution" version="opset1">
2012
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
2013
+ <input>
2014
+ <port id="0" precision="FP32">
2015
+ <dim>-1</dim>
2016
+ <dim>96</dim>
2017
+ <dim>2</dim>
2018
+ <dim>16</dim>
2019
+ </port>
2020
+ <port id="1" precision="FP32">
2021
+ <dim>96</dim>
2022
+ <dim>96</dim>
2023
+ <dim>1</dim>
2024
+ <dim>1</dim>
2025
+ </port>
2026
+ </input>
2027
+ <output>
2028
+ <port id="2" precision="FP32">
2029
+ <dim>-1</dim>
2030
+ <dim>96</dim>
2031
+ <dim>2</dim>
2032
+ <dim>16</dim>
2033
+ </port>
2034
+ </output>
2035
+ </layer>
2036
+ <layer id="104" name="Reshape_11988_compressed" type="Const" version="opset1">
2037
+ <data element_type="f16" shape="1, 96, 1, 1" offset="129168" size="192" />
2038
+ <output>
2039
+ <port id="0" precision="FP16">
2040
+ <dim>1</dim>
2041
+ <dim>96</dim>
2042
+ <dim>1</dim>
2043
+ <dim>1</dim>
2044
+ </port>
2045
+ </output>
2046
+ </layer>
2047
+ <layer id="105" name="Reshape_11988" type="Convert" version="opset1">
2048
+ <data destination_type="f32" />
2049
+ <rt_info>
2050
+ <attribute name="decompression" version="0" />
2051
+ </rt_info>
2052
+ <input>
2053
+ <port id="0" precision="FP16">
2054
+ <dim>1</dim>
2055
+ <dim>96</dim>
2056
+ <dim>1</dim>
2057
+ <dim>1</dim>
2058
+ </port>
2059
+ </input>
2060
+ <output>
2061
+ <port id="1" precision="FP32">
2062
+ <dim>1</dim>
2063
+ <dim>96</dim>
2064
+ <dim>1</dim>
2065
+ <dim>1</dim>
2066
+ </port>
2067
+ </output>
2068
+ </layer>
2069
+ <layer id="106" name="/backbone/4/Conv_1" type="Add" version="opset1">
2070
+ <data auto_broadcast="numpy" />
2071
+ <input>
2072
+ <port id="0" precision="FP32">
2073
+ <dim>-1</dim>
2074
+ <dim>96</dim>
2075
+ <dim>2</dim>
2076
+ <dim>16</dim>
2077
+ </port>
2078
+ <port id="1" precision="FP32">
2079
+ <dim>1</dim>
2080
+ <dim>96</dim>
2081
+ <dim>1</dim>
2082
+ <dim>1</dim>
2083
+ </port>
2084
+ </input>
2085
+ <output>
2086
+ <port id="2" precision="FP32" names="/backbone/4/Conv_1_output_0">
2087
+ <dim>-1</dim>
2088
+ <dim>96</dim>
2089
+ <dim>2</dim>
2090
+ <dim>16</dim>
2091
+ </port>
2092
+ </output>
2093
+ </layer>
2094
+ <layer id="107" name="/backbone/4/Relu" type="ReLU" version="opset1">
2095
+ <input>
2096
+ <port id="0" precision="FP32">
2097
+ <dim>-1</dim>
2098
+ <dim>96</dim>
2099
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="/backbone/4/Relu_output_0">
+ <dim>-1</dim>
+ <dim>96</dim>
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="108" name="onnx::Conv_168_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="64, 96, 1, 1" offset="129360" size="12288" />
+ <output>
+ <port id="0" precision="FP16" names="onnx::Conv_168">
+ <dim>64</dim>
+ <dim>96</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="109" name="onnx::Conv_168" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>64</dim>
+ <dim>96</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>64</dim>
+ <dim>96</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="110" name="/backbone/conv_head/Conv/WithoutBiases" type="Convolution" version="opset1">
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>96</dim>
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>64</dim>
+ <dim>96</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>64</dim>
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="111" name="Reshape_12005_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="1, 64, 1, 1" offset="141648" size="128" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>64</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="112" name="Reshape_12005" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>64</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>64</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="113" name="/backbone/conv_head/Conv" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>64</dim>
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>64</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/conv_head/Conv_output_0">
+ <dim>-1</dim>
+ <dim>64</dim>
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="114" name="/backbone/Relu" type="ReLU" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>64</dim>
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="/backbone/Relu_output_0">
+ <dim>-1</dim>
+ <dim>64</dim>
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="115" name="Range_12016" type="Const" version="opset1">
+ <data element_type="i64" shape="2" offset="141776" size="16" />
+ <output>
+ <port id="0" precision="I64">
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="116" name="/backbone/global_avg_pooling/GlobalAveragePool" type="ReduceMean" version="opset1">
+ <data keep_dims="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>64</dim>
+ <dim>2</dim>
+ <dim>16</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>2</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/global_avg_pooling/GlobalAveragePool_output_0">
+ <dim>-1</dim>
+ <dim>64</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="117" name="Constant_12022" type="Const" version="opset1">
+ <data element_type="i64" shape="2" offset="141792" size="16" />
+ <rt_info>
+ <attribute name="precise" version="0" />
+ </rt_info>
+ <output>
+ <port id="0" precision="I64">
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="118" name="/backbone/Flatten" type="Reshape" version="opset1">
+ <data special_zero="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>64</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>2</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="/backbone/Flatten_output_0">
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="119" name="fc.weight_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="2, 64" offset="141808" size="256" />
+ <output>
+ <port id="0" precision="FP16" names="fc.weight">
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="120" name="fc.weight" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="121" name="/fc/Gemm/WithoutBiases" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="122" name="Constant_14088_compressed" type="Const" version="opset1">
+ <data element_type="f16" shape="1, 2" offset="142064" size="4" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="123" name="Constant_14088" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1</dim>
+ <dim>2</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="124" name="output" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>2</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>2</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="output">
+ <dim>-1</dim>
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="125" name="output/sink_port_0" type="Result" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>2</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="3" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="2" to-port="0" />
+ <edge from-layer="2" from-port="1" to-layer="3" to-port="1" />
+ <edge from-layer="3" from-port="2" to-layer="6" to-port="0" />
+ <edge from-layer="4" from-port="0" to-layer="5" to-port="0" />
+ <edge from-layer="5" from-port="1" to-layer="6" to-port="1" />
+ <edge from-layer="6" from-port="2" to-layer="7" to-port="0" />
+ <edge from-layer="7" from-port="1" to-layer="10" to-port="0" />
+ <edge from-layer="8" from-port="0" to-layer="9" to-port="0" />
+ <edge from-layer="9" from-port="1" to-layer="10" to-port="1" />
+ <edge from-layer="10" from-port="2" to-layer="13" to-port="0" />
+ <edge from-layer="11" from-port="0" to-layer="12" to-port="0" />
+ <edge from-layer="12" from-port="1" to-layer="13" to-port="1" />
+ <edge from-layer="13" from-port="2" to-layer="16" to-port="0" />
+ <edge from-layer="14" from-port="0" to-layer="15" to-port="0" />
+ <edge from-layer="15" from-port="1" to-layer="16" to-port="1" />
+ <edge from-layer="16" from-port="2" to-layer="17" to-port="0" />
+ <edge from-layer="17" from-port="1" to-layer="20" to-port="0" />
+ <edge from-layer="18" from-port="0" to-layer="19" to-port="0" />
+ <edge from-layer="19" from-port="1" to-layer="20" to-port="1" />
+ <edge from-layer="20" from-port="2" to-layer="23" to-port="0" />
+ <edge from-layer="21" from-port="0" to-layer="22" to-port="0" />
+ <edge from-layer="22" from-port="1" to-layer="23" to-port="1" />
+ <edge from-layer="23" from-port="2" to-layer="26" to-port="0" />
+ <edge from-layer="24" from-port="0" to-layer="25" to-port="0" />
+ <edge from-layer="25" from-port="1" to-layer="26" to-port="1" />
+ <edge from-layer="26" from-port="2" to-layer="27" to-port="0" />
+ <edge from-layer="27" from-port="1" to-layer="30" to-port="0" />
+ <edge from-layer="28" from-port="0" to-layer="29" to-port="0" />
+ <edge from-layer="29" from-port="1" to-layer="30" to-port="1" />
+ <edge from-layer="30" from-port="2" to-layer="33" to-port="0" />
+ <edge from-layer="31" from-port="0" to-layer="32" to-port="0" />
+ <edge from-layer="32" from-port="1" to-layer="33" to-port="1" />
+ <edge from-layer="33" from-port="2" to-layer="36" to-port="0" />
+ <edge from-layer="34" from-port="0" to-layer="35" to-port="0" />
+ <edge from-layer="35" from-port="1" to-layer="36" to-port="1" />
+ <edge from-layer="36" from-port="2" to-layer="37" to-port="0" />
+ <edge from-layer="37" from-port="1" to-layer="40" to-port="0" />
+ <edge from-layer="38" from-port="0" to-layer="39" to-port="0" />
+ <edge from-layer="39" from-port="1" to-layer="40" to-port="1" />
+ <edge from-layer="40" from-port="2" to-layer="43" to-port="0" />
+ <edge from-layer="41" from-port="0" to-layer="42" to-port="0" />
+ <edge from-layer="42" from-port="1" to-layer="43" to-port="1" />
+ <edge from-layer="43" from-port="2" to-layer="46" to-port="0" />
+ <edge from-layer="44" from-port="0" to-layer="45" to-port="0" />
+ <edge from-layer="45" from-port="1" to-layer="46" to-port="1" />
+ <edge from-layer="46" from-port="2" to-layer="47" to-port="0" />
+ <edge from-layer="47" from-port="1" to-layer="50" to-port="0" />
+ <edge from-layer="48" from-port="0" to-layer="49" to-port="0" />
+ <edge from-layer="49" from-port="1" to-layer="50" to-port="1" />
+ <edge from-layer="50" from-port="2" to-layer="53" to-port="0" />
+ <edge from-layer="51" from-port="0" to-layer="52" to-port="0" />
+ <edge from-layer="52" from-port="1" to-layer="53" to-port="1" />
+ <edge from-layer="53" from-port="2" to-layer="56" to-port="0" />
+ <edge from-layer="54" from-port="0" to-layer="55" to-port="0" />
+ <edge from-layer="55" from-port="1" to-layer="56" to-port="1" />
+ <edge from-layer="56" from-port="2" to-layer="57" to-port="0" />
+ <edge from-layer="57" from-port="1" to-layer="60" to-port="0" />
+ <edge from-layer="58" from-port="0" to-layer="59" to-port="0" />
+ <edge from-layer="59" from-port="1" to-layer="60" to-port="1" />
+ <edge from-layer="60" from-port="2" to-layer="63" to-port="0" />
+ <edge from-layer="61" from-port="0" to-layer="62" to-port="0" />
+ <edge from-layer="62" from-port="1" to-layer="63" to-port="1" />
+ <edge from-layer="63" from-port="2" to-layer="66" to-port="0" />
+ <edge from-layer="64" from-port="0" to-layer="65" to-port="0" />
+ <edge from-layer="65" from-port="1" to-layer="66" to-port="1" />
+ <edge from-layer="66" from-port="2" to-layer="67" to-port="0" />
+ <edge from-layer="67" from-port="1" to-layer="70" to-port="0" />
+ <edge from-layer="68" from-port="0" to-layer="69" to-port="0" />
+ <edge from-layer="69" from-port="1" to-layer="70" to-port="1" />
+ <edge from-layer="70" from-port="2" to-layer="73" to-port="0" />
+ <edge from-layer="71" from-port="0" to-layer="72" to-port="0" />
+ <edge from-layer="72" from-port="1" to-layer="73" to-port="1" />
+ <edge from-layer="73" from-port="2" to-layer="76" to-port="0" />
+ <edge from-layer="74" from-port="0" to-layer="75" to-port="0" />
+ <edge from-layer="75" from-port="1" to-layer="76" to-port="1" />
+ <edge from-layer="76" from-port="2" to-layer="77" to-port="0" />
+ <edge from-layer="77" from-port="1" to-layer="80" to-port="0" />
+ <edge from-layer="78" from-port="0" to-layer="79" to-port="0" />
+ <edge from-layer="79" from-port="1" to-layer="80" to-port="1" />
+ <edge from-layer="80" from-port="2" to-layer="83" to-port="0" />
+ <edge from-layer="81" from-port="0" to-layer="82" to-port="0" />
+ <edge from-layer="82" from-port="1" to-layer="83" to-port="1" />
+ <edge from-layer="83" from-port="2" to-layer="86" to-port="0" />
+ <edge from-layer="84" from-port="0" to-layer="85" to-port="0" />
+ <edge from-layer="85" from-port="1" to-layer="86" to-port="1" />
+ <edge from-layer="86" from-port="2" to-layer="87" to-port="0" />
+ <edge from-layer="87" from-port="1" to-layer="90" to-port="0" />
+ <edge from-layer="88" from-port="0" to-layer="89" to-port="0" />
+ <edge from-layer="89" from-port="1" to-layer="90" to-port="1" />
+ <edge from-layer="90" from-port="2" to-layer="93" to-port="0" />
+ <edge from-layer="91" from-port="0" to-layer="92" to-port="0" />
+ <edge from-layer="92" from-port="1" to-layer="93" to-port="1" />
+ <edge from-layer="93" from-port="2" to-layer="96" to-port="0" />
+ <edge from-layer="94" from-port="0" to-layer="95" to-port="0" />
+ <edge from-layer="95" from-port="1" to-layer="96" to-port="1" />
+ <edge from-layer="96" from-port="2" to-layer="97" to-port="0" />
+ <edge from-layer="97" from-port="1" to-layer="100" to-port="0" />
+ <edge from-layer="98" from-port="0" to-layer="99" to-port="0" />
+ <edge from-layer="99" from-port="1" to-layer="100" to-port="1" />
+ <edge from-layer="100" from-port="2" to-layer="103" to-port="0" />
+ <edge from-layer="101" from-port="0" to-layer="102" to-port="0" />
+ <edge from-layer="102" from-port="1" to-layer="103" to-port="1" />
+ <edge from-layer="103" from-port="2" to-layer="106" to-port="0" />
+ <edge from-layer="104" from-port="0" to-layer="105" to-port="0" />
+ <edge from-layer="105" from-port="1" to-layer="106" to-port="1" />
+ <edge from-layer="106" from-port="2" to-layer="107" to-port="0" />
+ <edge from-layer="107" from-port="1" to-layer="110" to-port="0" />
+ <edge from-layer="108" from-port="0" to-layer="109" to-port="0" />
+ <edge from-layer="109" from-port="1" to-layer="110" to-port="1" />
+ <edge from-layer="110" from-port="2" to-layer="113" to-port="0" />
+ <edge from-layer="111" from-port="0" to-layer="112" to-port="0" />
+ <edge from-layer="112" from-port="1" to-layer="113" to-port="1" />
+ <edge from-layer="113" from-port="2" to-layer="114" to-port="0" />
+ <edge from-layer="114" from-port="1" to-layer="116" to-port="0" />
+ <edge from-layer="115" from-port="0" to-layer="116" to-port="1" />
+ <edge from-layer="116" from-port="2" to-layer="118" to-port="0" />
+ <edge from-layer="117" from-port="0" to-layer="118" to-port="1" />
+ <edge from-layer="118" from-port="2" to-layer="121" to-port="0" />
+ <edge from-layer="119" from-port="0" to-layer="120" to-port="0" />
+ <edge from-layer="120" from-port="1" to-layer="121" to-port="1" />
+ <edge from-layer="121" from-port="2" to-layer="124" to-port="0" />
+ <edge from-layer="122" from-port="0" to-layer="123" to-port="0" />
+ <edge from-layer="123" from-port="1" to-layer="124" to-port="1" />
+ <edge from-layer="124" from-port="2" to-layer="125" to-port="0" />
+ </edges>
+ <rt_info>
+ <Runtime_version value="2023.3.0-13775-ceeafaf64f3-releases/2023/3" />
+ <conversion_parameters>
+ <input_model value="DIR/output_model.onnx" />
+ <is_python_object value="False" />
+ </conversion_parameters>
+ </rt_info>
+ </net>