Upload Date | July 06 2024 09:53 PM |
System Information | |
---|---|
Operating System | Ubuntu 24.04 LTS |
Model | Supermicro X9DRE-TF+/X9DR7-TF+ |
Motherboard | Supermicro X9DRE-TF+/X9DR7-TF+ |
CPU Information | |
---|---|
Name | Intel Xeon E5-2680 v2 |
Topology | 2 Processors, 20 Cores, 40 Threads |
Identifier | GenuineIntel Family 6 Model 62 Stepping 4 |
Base Frequency | 3.60 GHz |
Cluster 1 | 0 Cores |
L1 Instruction Cache | 32.0 KB x 10 |
L1 Data Cache | 32.0 KB x 10 |
L2 Cache | 256 KB x 10 |
L3 Cache | 25.0 MB x 1 |
Memory Information | |
---|---|
Size | 125.84 GB |
Inference Information | |
---|---|
Framework | TensorFlow Lite |
Backend | CPU |
Device | Default |
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 61 | 11.3 IPS |
Image Classification (F16) | 100% | 61 | 11.3 IPS |
Image Classification (I8) | 97% | 46 | 8.55 IPS |
Image Segmentation (F32) | 100% | 84 | 1.41 IPS |
Image Segmentation (F16) | 100% | 84 | 1.40 IPS |
Image Segmentation (I8) | 98% | 59 | 0.99 IPS |
Pose Estimation (F32) | 100% | 99 | 0.12 IPS |
Pose Estimation (F16) | 100% | 99 | 0.12 IPS |
Pose Estimation (I8) | 100% | 80 | 0.10 IPS |
Object Detection (F32) | 100% | 64 | 4.75 IPS |
Object Detection (F16) | 100% | 63 | 4.73 IPS |
Object Detection (I8) | 68% | 49 | 3.63 IPS |
Face Detection (F32) | 100% | 197 | 2.35 IPS |
Face Detection (F16) | 100% | 198 | 2.35 IPS |
Face Detection (I8) | 87% | 145 | 1.73 IPS |
Depth Estimation (F32) | 100% | 145 | 1.12 IPS |
Depth Estimation (F16) | 100% | 144 | 1.12 IPS |
Depth Estimation (I8) | 95% | 110 | 0.85 IPS |
Style Transfer (F32) | 100% | 295 | 0.39 IPS |
Style Transfer (F16) | 100% | 295 | 0.39 IPS |
Style Transfer (I8) | 98% | 241 | 0.32 IPS |
Image Super-Resolution (F32) | 100% | 68 | 2.45 IPS |
Image Super-Resolution (F16) | 100% | 68 | 2.44 IPS |
Image Super-Resolution (I8) | 98% | 51 | 1.82 IPS |
Text Classification (F32) | 100% | 62 | 88.9 IPS |
Text Classification (F16) | 100% | 61 | 88.2 IPS |
Text Classification (I8) | 92% | 41 | 58.4 IPS |
Machine Translation (F32) | 100% | 131 | 2.42 IPS |
Machine Translation (F16) | 100% | 132 | 2.43 IPS |
Machine Translation (I8) | 62% | 80 | 1.47 IPS |