From d4823258187e315eead5a1271907dcc9251cc080 Mon Sep 17 00:00:00 2001
From: DefTruth <31974251+DefTruth@users.noreply.github.com>
Date: Mon, 15 Aug 2022 14:21:06 +0800
Subject: [PATCH 01/14] update (#21)
* [feature][docs] add prebuilt windows python whl and cpp libs (#113)
[feature][docs] add windows python whl and cpp libs
* Add ernie ie task (#100)
* Modify API Result Docs (#112)
* first commit for yolov7
* pybind for yolov7
* CPP README.md
* CPP README.md
* modified yolov7.cc
* README.md
* python file modify
* delete license in fastdeploy/
* repush the conflict part
* README.md modified
* README.md modified
* file path modified
* README modified
* README modified
* move some helpers to private
* add examples for yolov7
* api.md modified
* YOLOv7
* yolov7 release link
* copyright
* change some helpers to private
* change variables to const and fix documents.
* gitignore
* Transfer some functions to private member of class
* Merge from develop (#9)
* Fix compile problem in different python version (#26)
* fix some usage problem in linux
* Fix compile problem
Co-authored-by: root
* Add PaddleDetection/PPYOLOE model support (#22)
* add ppdet/ppyoloe
* Add demo code and documents
* add convert processor to vision (#27)
* update .gitignore
* Added checking for cmake include dir
* fixed missing trt_backend option bug when init from trt
* remove unneeded data layout and add pre-check for dtype
* changed RGB2BRG to BGR2RGB in ppcls model
* add model_zoo yolov6 c++/python demo
* fixed CMakeLists.txt typos
* update yolov6 cpp/README.md
* add yolox c++/pybind and model_zoo demo
* move some helpers to private
* fixed CMakeLists.txt typos
* add normalize with alpha and beta
* add version notes for yolov5/yolov6/yolox
* add copyright to yolov5.cc
* revert normalize
* fixed some bugs in yolox
* fixed examples/CMakeLists.txt to avoid conflicts
* add convert processor to vision
* format examples/CMakeLists summary
* Fix bug while the inference result is empty with YOLOv5 (#29)
* Add multi-label function for yolov5
* Update README.md
Update doc
* Update fastdeploy_runtime.cc
fix variable option.trt_max_shape wrong name
* Update runtime_option.md
Update resnet model dynamic shape setting name from images to x
* Fix bug when inference result boxes are empty
* Delete detection.py
Co-authored-by: Jason
Co-authored-by: root
Co-authored-by: DefTruth <31974251+DefTruth@users.noreply.github.com>
Co-authored-by: huangjianhui <852142024@qq.com>
* first commit for yolor
* for merge
* Develop (#11)
* Yolor (#16)
* Develop (#11) (#12)
* Develop (#13)
* documents
* Develop (#14)
Co-authored-by: Jason <928090362@qq.com>
* add is_dynamic for YOLO series (#22)
* first commit test photo
* yolov7 doc
* add yolov5 docs
* modify yolov5 doc
* first commit for retinaface
* first commit for ultraface
* first commit for yolov5face
* first commit for modnet and arcface
* first commit for partial_fc
* first commit for yolox
* first commit for yolov6
* first commit for nano_det
* first commit for scrfd
* first commit for retinaface
* first commit for ultraface
* first commit for yolov5face
* first commit for yolox yolov6 nano
* rm jpg
* first commit for modnet and modify nano
* yolor scaledyolov4 v5lite
* first commit for insightface
* docs
* add print for detect and modify docs
* docs
* docs test for insightface
* docs test for insightface again
* docs test for insightface
* modify all wrong expressions in docs
* modify docs expressions
* fix expression of detection part
* add face recognition result doc
* modify result docs
Co-authored-by: Jason
Co-authored-by: root
Co-authored-by: DefTruth <31974251+DefTruth@users.noreply.github.com>
Co-authored-by: huangjianhui <852142024@qq.com>
Co-authored-by: Jason <928090362@qq.com>
Co-authored-by: Jack Zhou
Co-authored-by: ziqi-jin <67993288+ziqi-jin@users.noreply.github.com>
Co-authored-by: Jason
Co-authored-by: root
Co-authored-by: huangjianhui <852142024@qq.com>
Co-authored-by: Jason <928090362@qq.com>
---
FastDeploy.cmake.in | 4 +
csrc/fastdeploy/core/config.h.in | 8 +
docs/api/vision_results/README.md | 5 +-
.../vision_results/classification_result.md | 4 +-
docs/api/vision_results/detection_result.md | 6 +-
.../vision_results/face_detection_result.md | 8 +-
.../vision_results/face_recognition_result.md | 24 +++
docs/api/vision_results/matting_result.md | 10 +-
docs/compile/prebuilt_libraries.md | 10 +-
docs/compile/prebuilt_wheels.md | 11 +-
.../ernie/cpp/CMakeLists.txt | 25 +++
.../information_extraction/ernie/cpp/infer.cc | 182 ++++++++++++++++++
examples/vision/README.md | 1 +
13 files changed, 274 insertions(+), 24 deletions(-)
create mode 100644 docs/api/vision_results/face_recognition_result.md
create mode 100644 examples/text/information_extraction/ernie/cpp/CMakeLists.txt
create mode 100644 examples/text/information_extraction/ernie/cpp/infer.cc
diff --git a/FastDeploy.cmake.in b/FastDeploy.cmake.in
index 082fa30f30b..ccd1039be29 100644
--- a/FastDeploy.cmake.in
+++ b/FastDeploy.cmake.in
@@ -95,6 +95,10 @@ endif()
if (ENABLE_TEXT)
# Add dependency libs later
+ find_library(FASTER_TOKENIZER_LIB core_tokenizers ${CMAKE_CURRENT_LIST_DIR}/third_libs/install/faster_tokenizer/lib NO_DEFAULT_PATH)
+ list(APPEND FASTDEPLOY_LIBS ${FASTER_TOKENIZER_LIB})
+ list(APPEND FASTDEPLOY_INCS ${CMAKE_CURRENT_LIST_DIR}/third_libs/install/faster_tokenizer/include)
+ list(APPEND FASTDEPLOY_INCS ${CMAKE_CURRENT_LIST_DIR}/third_libs/install/faster_tokenizer/third_party/include)
endif()
if(ENABLE_PADDLE_FRONTEND)
diff --git a/csrc/fastdeploy/core/config.h.in b/csrc/fastdeploy/core/config.h.in
index 7713925867a..b29113f1fdb 100644
--- a/csrc/fastdeploy/core/config.h.in
+++ b/csrc/fastdeploy/core/config.h.in
@@ -45,6 +45,10 @@
#cmakedefine ENABLE_VISION
#endif
+#ifndef ENABLE_TEXT
+#cmakedefine ENABLE_TEXT
+#endif
+
#ifndef ENABLE_OPENCV_CUDA
#cmakedefine ENABLE_OPENCV_CUDA
#endif
@@ -52,3 +56,7 @@
#ifndef ENABLE_VISION_VISUALIZE
#cmakedefine ENABLE_VISION_VISUALIZE
#endif
+
+#ifndef ENABLE_FDTENSOR_FUNC
+#cmakedefine ENABLE_FDTENSOR_FUNC
+#endif
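(Editor's note on the `#cmakedefine ENABLE_TEXT` guard added above: a minimal hand-rolled sketch of how such a configure-time switch gates code. The macro is defined manually here purely for illustration; in the real build, CMake's `configure_file()` generates `config.h` from `config.h.in` and emits `#define ENABLE_TEXT` only when the option is ON.)

```cpp
// Sketch: a #cmakedefine'd switch such as ENABLE_TEXT gating a code path.
// The manual #define below stands in for what CMake would generate.
#define ENABLE_TEXT

bool TextModuleEnabled() {
#ifdef ENABLE_TEXT
  return true;   // text backend (e.g. faster_tokenizer) compiled in
#else
  return false;  // text APIs unavailable in this build
#endif
}
```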
diff --git a/docs/api/vision_results/README.md b/docs/api/vision_results/README.md
index 844388cca86..64ea4fc671b 100644
--- a/docs/api/vision_results/README.md
+++ b/docs/api/vision_results/README.md
@@ -6,5 +6,6 @@ FastDeploy根据视觉模型的任务类型,定义了不同的结构体(`csrcs
| :----- | :--- | :---- | :------- |
| ClassificationResult | [C++/Python文档](./classification_result.md) | 图像分类返回结果 | ResNet50、MobileNetV3等 |
| DetectionResult | [C++/Python文档](./detection_result.md) | 目标检测返回结果 | PPYOLOE、YOLOv7系列模型等 |
-| FaceDetectionResult | [C++/Python文档](./face_detection_result.md) | 目标检测返回结果 | PPYOLOE、YOLOv7系列模型等 |
-| MattingResult | [C++/Python文档](./matting_result.md) | 目标检测返回结果 | PPYOLOE、YOLOv7系列模型等 |
+| FaceDetectionResult | [C++/Python文档](./face_detection_result.md) | 目标检测返回结果 | SCRFD、RetinaFace系列模型等 |
+| FaceRecognitionResult | [C++/Python文档](./face_recognition_result.md) | 目标检测返回结果 | ArcFace、CosFace系列模型等 |
+| MattingResult | [C++/Python文档](./matting_result.md) | 目标检测返回结果 | MODNet系列模型等 |
diff --git a/docs/api/vision_results/classification_result.md b/docs/api/vision_results/classification_result.md
index 113db39608a..bf94d0ff159 100644
--- a/docs/api/vision_results/classification_result.md
+++ b/docs/api/vision_results/classification_result.md
@@ -2,7 +2,7 @@
ClassifyResult代码定义在`csrcs/fastdeploy/vision/common/result.h`中,用于表明图像的分类结果和置信度。
-## C++ 结构体
+## C++ 定义
`fastdeploy::vision::ClassifyResult`
@@ -20,7 +20,7 @@ struct ClassifyResult {
- **Clear()**: 成员函数,用于清除结构体中存储的结果
- **Str()**: 成员函数,将结构体中的信息以字符串形式输出(用于Debug)
-## Python结构体
+## Python 定义
`fastdeploy.vision.ClassifyResult`
diff --git a/docs/api/vision_results/detection_result.md b/docs/api/vision_results/detection_result.md
index e44a27b34c3..a702d49899f 100644
--- a/docs/api/vision_results/detection_result.md
+++ b/docs/api/vision_results/detection_result.md
@@ -2,7 +2,7 @@
DetectionResult代码定义在`csrcs/fastdeploy/vision/common/result.h`中,用于表明图像检测出来的目标框、目标类别和目标置信度。
-## C++ 结构体
+## C++ 定义
`fastdeploy::vision::DetectionResult`
@@ -22,10 +22,10 @@ struct DetectionResult {
- **Clear()**: 成员函数,用于清除结构体中存储的结果
- **Str()**: 成员函数,将结构体中的信息以字符串形式输出(用于Debug)
-## Python结构体
+## Python 定义
`fastdeploy.vision.DetectionResult`
- **boxes**(list of list(float)): 成员变量,表示单张图片检测出来的所有目标框坐标。boxes是一个list,其每个元素为一个长度为4的list, 表示为一个框,每个框以4个float数值依次表示xmin, ymin, xmax, ymax, 即左上角和右下角坐标
- **scores**(list of float): 成员变量,表示单张图片检测出来的所有目标置信度
-- **label_ids(list of int): 成员变量,表示单张图片检测出来的所有目标类别
+- **label_ids**(list of int): 成员变量,表示单张图片检测出来的所有目标类别
diff --git a/docs/api/vision_results/face_detection_result.md b/docs/api/vision_results/face_detection_result.md
index 6c9c09f0073..000b42a6be0 100644
--- a/docs/api/vision_results/face_detection_result.md
+++ b/docs/api/vision_results/face_detection_result.md
@@ -1,8 +1,8 @@
# FaceDetectionResult 人脸检测结果
-FaceDetectionResult 代码定义在`csrcs/fastdeploy/vision/common/result.h`中,用于表明图像检测出来的目标框、目标类别和目标置信度。
+FaceDetectionResult 代码定义在`csrcs/fastdeploy/vision/common/result.h`中,用于表明人脸检测出来的目标框、人脸landmarks,目标置信度和每张人脸的landmark数量。
-## C++ 结构体
+## C++ 定义
`fastdeploy::vision::FaceDetectionResult`
@@ -11,7 +11,6 @@ struct FaceDetectionResult {
std::vector<std::array<float, 4>> boxes;
std::vector<std::array<float, 2>> landmarks;
std::vector<float> scores;
- ResultType type = ResultType::FACE_DETECTION;
int landmarks_per_face;
void Clear();
std::string Str();
@@ -25,10 +24,11 @@ struct FaceDetectionResult {
- **Clear()**: 成员函数,用于清除结构体中存储的结果
- **Str()**: 成员函数,将结构体中的信息以字符串形式输出(用于Debug)
-## Python结构体
+## Python 定义
`fastdeploy.vision.FaceDetectionResult`
- **boxes**(list of list(float)): 成员变量,表示单张图片检测出来的所有目标框坐标。boxes是一个list,其每个元素为一个长度为4的list, 表示为一个框,每个框以4个float数值依次表示xmin, ymin, xmax, ymax, 即左上角和右下角坐标
- **scores**(list of float): 成员变量,表示单张图片检测出来的所有目标置信度
- **landmarks**: 成员变量,表示单张图片检测出来的所有人脸的关键点
+- **landmarks_per_face**: 成员变量,表示每个人脸框中的关键点的数量。
diff --git a/docs/api/vision_results/face_recognition_result.md b/docs/api/vision_results/face_recognition_result.md
new file mode 100644
index 00000000000..83160561843
--- /dev/null
+++ b/docs/api/vision_results/face_recognition_result.md
@@ -0,0 +1,24 @@
+# FaceRecognitionResult 人脸识别结果
+
+FaceRecognitionResult 代码定义在`csrcs/fastdeploy/vision/common/result.h`中,用于表明人脸识别模型对图像特征的embedding。
+## C++ 定义
+
+`fastdeploy::vision::FaceRecognitionResult`
+
+```
+struct FaceRecognitionResult {
+ std::vector<float> embedding;
+ void Clear();
+ std::string Str();
+};
+```
+
+- **embedding**: 成员变量,表示人脸识别模型最终的提取的特征embedding,可以用来计算人脸之间的特征相似度。
+- **Clear()**: 成员函数,用于清除结构体中存储的结果
+- **Str()**: 成员函数,将结构体中的信息以字符串形式输出(用于Debug)
+
+## Python 定义
+
+`fastdeploy.vision.FaceRecognitionResult`
+
+- **embedding**: 成员变量,表示人脸识别模型最终提取的特征embedding,可以用来计算人脸之间的特征相似度。
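(Editor's note: the doc above says the embedding can be used to compute face-to-face feature similarity. A minimal sketch of cosine similarity over two such embeddings — plain C++, not a FastDeploy API:)

```cpp
#include <cmath>
#include <vector>

// Cosine similarity between two face embeddings, as one might compute it
// from FaceRecognitionResult::embedding. Illustrative helper, not part of
// the FastDeploy API.
float CosineSimilarity(const std::vector<float>& a,
                       const std::vector<float>& b) {
  float dot = 0.f, norm_a = 0.f, norm_b = 0.f;
  for (size_t i = 0; i < a.size() && i < b.size(); ++i) {
    dot += a[i] * b[i];
    norm_a += a[i] * a[i];
    norm_b += b[i] * b[i];
  }
  if (norm_a == 0.f || norm_b == 0.f) return 0.f;  // guard degenerate input
  return dot / (std::sqrt(norm_a) * std::sqrt(norm_b));
}
```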
diff --git a/docs/api/vision_results/matting_result.md b/docs/api/vision_results/matting_result.md
index 3418400ecaa..67bcbc79d21 100644
--- a/docs/api/vision_results/matting_result.md
+++ b/docs/api/vision_results/matting_result.md
@@ -1,15 +1,15 @@
# MattingResult 抠图结果
-MattingResult 代码定义在`csrcs/fastdeploy/vision/common/result.h`中,用于表明图像检测出来的目标框、目标类别和目标置信度。
+MattingResult 代码定义在`csrcs/fastdeploy/vision/common/result.h`中,用于表明模型预测的alpha透明度的值,预测的前景等。
-## C++ 结构体
+## C++ 定义
`fastdeploy::vision::MattingResult`
```
struct MattingResult {
- std::vector<float> alpha; // h x w
- std::vector<float> foreground; // h x w x c (c=3 default)
+ std::vector<float> alpha;
+ std::vector<float> foreground;
 std::vector<int64_t> shape;
bool contain_foreground = false;
void Clear();
@@ -25,7 +25,7 @@ struct MattingResult {
- **Str()**: 成员函数,将结构体中的信息以字符串形式输出(用于Debug)
-## Python结构体
+## Python 定义
`fastdeploy.vision.MattingResult`
diff --git a/docs/compile/prebuilt_libraries.md b/docs/compile/prebuilt_libraries.md
index 6cec3721acd..bd58fc4b0b8 100644
--- a/docs/compile/prebuilt_libraries.md
+++ b/docs/compile/prebuilt_libraries.md
@@ -19,17 +19,17 @@ FastDeploy提供了在Windows/Linux/Mac上的预先编译CPP部署库,开发
### Windows 10 x64平台
-| 部署库下载地址 | 硬件 |
-| :------------- | :--- |
-| [comming...] | CPU |
-| [comming...] | CPU/GPU |
+| 部署库下载地址 | 硬件 | 说明 |
+| :------------- | :--- | :--- |
+| [fastdeploy-win-x64-0.2.0](https://bj.bcebos.com/paddlehub/fastdeploy/cpp/fastdeploy-win-x64-0.2.0.zip) | CPU | Visual Studio 16 2019 编译产出 |
+| [fastdeploy-win-x64-gpu-0.2.0](https://bj.bcebos.com/paddlehub/fastdeploy/cpp/fastdeploy-win-x64-gpu-0.2.0.zip) | CPU/GPU | Visual Studio 16 2019,cuda 11.2, cudnn 8.2编译产出 |
### Linux aarch64平台
| 安装包 | 硬件 |
| :---- | :-- |
| [comming...] | CPU |
-| [comming...] | Jetson |
+| [comming...] | Jetson |
### Mac OSX平台
diff --git a/docs/compile/prebuilt_wheels.md b/docs/compile/prebuilt_wheels.md
index 14ba7d40044..e3ada892e11 100644
--- a/docs/compile/prebuilt_wheels.md
+++ b/docs/compile/prebuilt_wheels.md
@@ -38,15 +38,20 @@ python -m pip install fastdeploy_python-0.2.0-cp38-cp38-manylinux1_x86_64.whl
| CPU 安装包 | 硬件 | Python版本 |
| :---- | :-- | :------ |
-| [comming...] | CPU | 3.8 |
-| [comming...] | CPU | 3.9 |
+| [fastdeploy_python-0.2.0-cp38-cp38-win_amd64.whl](https://bj.bcebos.com/paddlehub/fastdeploy/wheels/fastdeploy_python-0.2.0-cp38-cp38-win_amd64.whl) | CPU | 3.8 |
+| [fastdeploy_python-0.2.0-cp39-cp39-win_amd64.whl](https://bj.bcebos.com/paddlehub/fastdeploy/wheels/fastdeploy_python-0.2.0-cp39-cp39-win_amd64.whl) | CPU | 3.9 |
+
+| GPU 安装包 | 硬件 | Python版本 |
+| :---- | :-- | :------ |
+| [fastdeploy_gpu_python-0.2.0-cp38-cp38-win_amd64.whl](https://bj.bcebos.com/paddlehub/fastdeploy/wheels/fastdeploy_gpu_python-0.2.0-cp38-cp38-win_amd64.whl) | CPU/GPU | 3.8 |
+| [fastdeploy_gpu_python-0.2.0-cp39-cp39-win_amd64.whl](https://bj.bcebos.com/paddlehub/fastdeploy/wheels/fastdeploy_gpu_python-0.2.0-cp39-cp39-win_amd64.whl) | CPU/GPU | 3.9 |
### Linux aarch64平台
| 安装包 | 硬件 | Python版本 |
| :---- | :-- | :------ |
| [comming...] | CPU | 3.7 |
-| [comming...] | CPU | 3.8 |
+| [comming...] | CPU | 3.8 |
| [comming...] | CPU | 3.9 |
### Mac OSX平台
diff --git a/examples/text/information_extraction/ernie/cpp/CMakeLists.txt b/examples/text/information_extraction/ernie/cpp/CMakeLists.txt
new file mode 100644
index 00000000000..1189820cb79
--- /dev/null
+++ b/examples/text/information_extraction/ernie/cpp/CMakeLists.txt
@@ -0,0 +1,25 @@
+# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+PROJECT(infer_demo C CXX)
+CMAKE_MINIMUM_REQUIRED (VERSION 3.12)
+
+option(FASTDEPLOY_INSTALL_DIR "Path of downloaded fastdeploy sdk.")
+
+include(${FASTDEPLOY_INSTALL_DIR}/FastDeploy.cmake)
+
+include_directories(${FASTDEPLOY_INCS})
+
+add_executable(infer_ernie_demo ${PROJECT_SOURCE_DIR}/infer.cc)
+target_link_libraries(infer_ernie_demo ${FASTDEPLOY_LIBS})
diff --git a/examples/text/information_extraction/ernie/cpp/infer.cc b/examples/text/information_extraction/ernie/cpp/infer.cc
new file mode 100644
index 00000000000..7f3b9318664
--- /dev/null
+++ b/examples/text/information_extraction/ernie/cpp/infer.cc
@@ -0,0 +1,182 @@
+// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+#include <iostream>
+#include <sstream>
+
+#include "fastdeploy/function/reduce.h"
+#include "fastdeploy/function/softmax.h"
+#include "fastdeploy/text.h"
+#include "tokenizers/ernie_faster_tokenizer.h"
+
+using namespace paddlenlp;
+
+void LoadTransitionFromFile(const std::string& file,
+ std::vector<float>* transitions, int* num_tags) {
+ std::ifstream fin(file);
+ std::string curr_transition;
+ float transition;
+ int i = 0;
+ while (fin) {
+ std::getline(fin, curr_transition);
+ std::istringstream iss(curr_transition);
+ while (iss) {
+ iss >> transition;
+ transitions->push_back(transition);
+ }
+ if (curr_transition != "") {
+ ++i;
+ }
+ }
+ *num_tags = i;
+}
+
+template <typename T>
+void ViterbiDecode(const fastdeploy::FDTensor& slot_logits,
+ const fastdeploy::FDTensor& trans,
+ fastdeploy::FDTensor* best_path) {
+ int batch_size = slot_logits.shape[0];
+ int seq_len = slot_logits.shape[1];
+ int num_tags = slot_logits.shape[2];
+ best_path->Allocate({batch_size, seq_len}, fastdeploy::FDDataType::INT64);
+
+ const T* slot_logits_ptr = reinterpret_cast<const T*>(slot_logits.Data());
+ const T* trans_ptr = reinterpret_cast<const T*>(trans.Data());
+ int64_t* best_path_ptr = reinterpret_cast<int64_t*>(best_path->Data());
+ std::vector<T> scores(num_tags);
+ std::copy(slot_logits_ptr, slot_logits_ptr + num_tags, scores.begin());
+ std::vector<std::vector<T>> M(num_tags, std::vector<T>(num_tags));
+ for (int b = 0; b < batch_size; ++b) {
+ std::vector<std::vector<int>> paths;
+ const T* curr_slot_logits_ptr = slot_logits_ptr + b * seq_len * num_tags;
+ int64_t* curr_best_path_ptr = best_path_ptr + b * seq_len;
+ for (int t = 1; t < seq_len; t++) {
+ for (size_t i = 0; i < num_tags; i++) {
+ for (size_t j = 0; j < num_tags; j++) {
+ auto trans_idx = i * num_tags * num_tags + j * num_tags;
+ auto slot_logit_idx = t * num_tags + j;
+ M[i][j] = scores[i] + trans_ptr[trans_idx] +
+ curr_slot_logits_ptr[slot_logit_idx];
+ }
+ }
+ std::vector<int> idxs;
+ for (size_t i = 0; i < num_tags; i++) {
+ T max = 0.0f;
+ int idx = 0;
+ for (size_t j = 0; j < num_tags; j++) {
+ if (M[j][i] > max) {
+ max = M[j][i];
+ idx = j;
+ }
+ }
+ scores[i] = max;
+ idxs.push_back(idx);
+ }
+ paths.push_back(idxs);
+ }
+ int scores_max_index = 0;
+ float scores_max = 0.0f;
+ for (size_t i = 0; i < scores.size(); i++) {
+ if (scores[i] > scores_max) {
+ scores_max = scores[i];
+ scores_max_index = i;
+ }
+ }
+ curr_best_path_ptr[seq_len - 1] = scores_max_index;
+ for (int i = seq_len - 2; i >= 0; i--) {
+ int index = curr_best_path_ptr[i + 1];
+ curr_best_path_ptr[i] = paths[i][index];
+ }
+ }
+}
+
+int main() {
+ // 1. Define a ernie faster tokenizer
+ faster_tokenizer::tokenizers_impl::ErnieFasterTokenizer tokenizer(
+ "ernie_vocab.txt");
+ std::vector<std::string> strings_list = {
+ "导航去科技园二号楼", "屏幕亮度为我减小一点吧"};
+ std::vector<faster_tokenizer::core::Encoding> encodings;
+ tokenizer.EncodeBatchStrings(strings_list, &encodings);
+ size_t batch_size = strings_list.size();
+ size_t seq_len = encodings[0].GetLen();
+ for (auto&& encoding : encodings) {
+ std::cout << encoding.DebugString() << std::endl;
+ }
+ // 2. Initialize runtime
+ fastdeploy::RuntimeOption runtime_option;
+ runtime_option.SetModelPath("nano_static/model.pdmodel",
+ "nano_static/model.pdiparams");
+ fastdeploy::Runtime runtime;
+ runtime.Init(runtime_option);
+
+ // 3. Construct input vector
+ // 3.1 Convert encodings to input_ids, token_type_ids
+ std::vector<int64_t> input_ids, token_type_ids;
+ for (int i = 0; i < encodings.size(); ++i) {
+ auto&& curr_input_ids = encodings[i].GetIds();
+ auto&& curr_type_ids = encodings[i].GetTypeIds();
+ input_ids.insert(input_ids.end(), curr_input_ids.begin(),
+ curr_input_ids.end());
+ token_type_ids.insert(token_type_ids.end(), curr_type_ids.begin(),
+ curr_type_ids.end());
+ }
+ // 3.2 Set data to input vector
+ std::vector<fastdeploy::FDTensor> inputs(runtime.NumInputs());
+ void* inputs_ptrs[] = {input_ids.data(), token_type_ids.data()};
+ for (int i = 0; i < runtime.NumInputs(); ++i) {
+ inputs[i].SetExternalData({batch_size, seq_len},
+ fastdeploy::FDDataType::INT64, inputs_ptrs[i]);
+ inputs[i].name = runtime.GetInputInfo(i).name;
+ }
+
+ // 4. Infer
+ std::vector<fastdeploy::FDTensor> outputs(runtime.NumOutputs());
+ runtime.Infer(inputs, &outputs);
+
+ // 5. Postprocess
+ fastdeploy::FDTensor domain_probs, intent_probs;
+ fastdeploy::Softmax(outputs[0], &domain_probs);
+ fastdeploy::Softmax(outputs[1], &intent_probs);
+
+ fastdeploy::FDTensor domain_max_probs, intent_max_probs;
+ fastdeploy::Max(domain_probs, &domain_max_probs, {-1}, true);
+ fastdeploy::Max(intent_probs, &intent_max_probs, {-1}, true);
+
+ std::vector<float> transition;
+ int num_tags;
+ LoadTransitionFromFile("joint_transition.txt", &transition, &num_tags);
+ fastdeploy::FDTensor trans;
+ trans.SetExternalData({num_tags, num_tags}, fastdeploy::FDDataType::FP32,
+ transition.data());
+
+ fastdeploy::FDTensor best_path;
+ ViterbiDecode<float>(outputs[2], trans, &best_path);
+ // 6. Print result
+ domain_max_probs.PrintInfo();
+ intent_max_probs.PrintInfo();
+
+ batch_size = best_path.shape[0];
+ seq_len = best_path.shape[1];
+ const int64_t* best_path_ptr =
+ reinterpret_cast<const int64_t*>(best_path.Data());
+ for (int i = 0; i < batch_size; ++i) {
+ std::cout << "best_path[" << i << "] = ";
+ for (int j = 0; j < seq_len; ++j) {
+ std::cout << best_path_ptr[i * seq_len + j] << ", ";
+ }
+ std::cout << std::endl;
+ }
+ best_path.PrintInfo();
+ return 0;
+}
diff --git a/examples/vision/README.md b/examples/vision/README.md
index 9f05d2d7f6d..d95a315d798 100644
--- a/examples/vision/README.md
+++ b/examples/vision/README.md
@@ -8,6 +8,7 @@
| Segmentation | 语义分割,输入图像,给出图像中每个像素的分类及置信度 | [SegmentationResult](../../docs/api/vision_results/segmentation_result.md) |
| Classification | 图像分类,输入图像,给出图像的分类结果和置信度 | [ClassifyResult](../../docs/api/vision_results/classification_result.md) |
| FaceDetection | 人脸检测,输入图像,检测图像中人脸位置,并返回检测框坐标及人脸关键点 | [FaceDetectionResult](../../docs/api/vision_results/face_detection_result.md) |
+| FaceRecognition | 人脸识别,输入图像,返回可用于相似度计算的人脸特征的embedding | [FaceRecognitionResult](../../docs/api/vision_results/face_recognition_result.md) |
| Matting | 抠图,输入图像,返回图片的前景每个像素点的Alpha值 | [MattingResult](../../docs/api/vision_results/matting_result.md) |
## FastDeploy API设计
From 8d3e7d4d1929a9e23f70f7bc546be8b3fa1b753c Mon Sep 17 00:00:00 2001
From: DefTruth
Date: Mon, 15 Aug 2022 08:51:26 +0000
Subject: [PATCH 02/14] [feature][docs] update README.md and logo
---
README.md | 261 ++++++++++++++++++-----------
docs/compile/prebuilt_libraries.md | 2 +-
docs/logo/fastdeploy-logo.png | Bin 0 -> 133793 bytes
3 files changed, 167 insertions(+), 96 deletions(-)
create mode 100644 docs/logo/fastdeploy-logo.png
diff --git a/README.md b/README.md
index 5619db16009..77256586273 100644
--- a/README.md
+++ b/README.md
@@ -1,9 +1,7 @@
-# ⚡️FastDeploy
+
-------------------------------------------------------------------------------------------
-
@@ -18,75 +16,142 @@
**⚡️FastDeploy**是一款**简单易用**的推理部署工具箱。覆盖业界主流**优质预训练模型**并提供**开箱即用**的开发体验,包括图像分类、目标检测、图像分割、人脸检测、人体关键点识别、文字识别等多任务,满足开发者**多场景**,**多硬件**、**多平台**的快速部署需求。
-## 发版历史
+## 0. 发版历史
- [v0.2.0] 2022.08.18 全面开源服务端部署代码,支持40+视觉模型在CPU/GPU,以及通过GPU TensorRT加速部署
-## 服务端模型
-
-| 任务场景 | 模型 | CPU | NVIDIA GPU | TensorRT |
-| -------- | ------------------------------------------------------------ | ------- | ---------- | ------------------- |
-| 图像分类 | [PaddleClas/ResNet50](./examples/vision/classification/paddleclas) | √ | √ | √ |
-| | [PaddleClas/PPLCNet](./examples/vision/classification/paddleclas) | √ | √ | √ |
-| | [PaddleClas/EfficientNet](./examples/vision/classification/paddleclas) | √ | √ | √ |
-| | [PaddleClas/GhostNet](./examples/vision/classification/paddleclas) | √ | √ | √ |
-| | [PaddleClas/MobileNetV1](./examples/vision/classification/paddleclas) | √ | √ | √ |
-| | [PaddleClas/MobileNetV2](./examples/vision/classification/paddleclas) | √ | √ | √ |
-| | [PaddleClas/ShuffleNetV2](./examples/vision/classification/paddleclas) | √ | √ | √ |
-| 目标检测 | [PaddleDetection/PPYOLOE](./examples/vision/detection/paddledetection) | √ | √ | √ |
-| | [PaddleDetection/PicoDet](./examples/vision/detection/paddledetection) | √ | √ | √ |
-| | [PaddleDetection/YOLOX](./examples/vision/detection/paddledetection) | √ | √ | √ |
-| | [PaddleDetection/YOLOv3](./examples/vision/detection/paddledetection) | √ | √ | √ |
-| | [PaddleDetection/PPYOLO](./examples/vision/detection/paddledetection) | √ | √ | - |
-| | [PaddleDetection/PPYOLOv2](./examples/vision/detection/paddledetection) | √ | √ | - |
-| | [PaddleDetection/FasterRCNN](./examples/vision/detection/paddledetection) | √ | √ | - |
-| | [WongKinYiu/YOLOv7](./examples/vision/detection/yolov7) | √ | √ | √ |
-
-## 快速开始
-
-#### 安装FastDeploy Python
-
+## 1. Contents
+* [Server-side model list](#fastdeploy-server-models)
+* [Server-side quick start](#fastdeploy-quick-start)
+ * [Python inference example](#fastdeploy-quick-start-python)
+ * [C++ inference example](#fastdeploy-quick-start-cpp)
+* [More server-side deployment examples](#fastdeploy-server-cases)
+* [Lightweight SDK for fast on-device AI deployment](#fastdeploy-edge-sdk)
+ * [Edge deployment](#fastdeploy-edge-sdk-arm-linux)
+ * [Mobile deployment](#fastdeploy-edge-sdk-ios-android)
+ * [Custom model deployment](#fastdeploy-edge-sdk-custom)
+* [Community](#fastdeploy-community)
+* [Acknowledge](#fastdeploy-acknowledge)
+* [License](#fastdeploy-license)
+## 2. Server-Side Model List
+
+
+
+Legend: (1) √: supported; (2) ?: pending detailed testing; (3) -: not yet supported; (4) Contrib: models from outside the PaddlePaddle ecosystem
+| Task | Model | API | CPU | NVIDIA GPU | Paddle Inference | TensorRT | ORT |
+| -------- | ------------------------------------------------------------ | ------- | ------- | ---------- | ---------| ---------| ---------|
+| Image Classification | [PaddleClas/ResNet50](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/PPLCNet](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/PPLCNetv2](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/EfficientNet](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/GhostNet](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/MobileNetV1](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/MobileNetV2](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/MobileNetV3](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/ShuffleNetV2](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/SqueezeNetV1.1](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/Inceptionv3](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/PP-HGNet](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Image Classification | [PaddleClas/SwinTransformer](./examples/vision/classification/paddleclas) | [Python](./examples/vision/classification/paddleclas/python)/[C++](./examples/vision/classification/paddleclas/cpp) | √ | √ | √ | √ | ? |
+| Object Detection | [PaddleDetection/PPYOLOE](./examples/vision/detection/paddledetection) | [Python](./examples/vision/detection/paddledetection/python)/[C++](./examples/vision/detection/paddledetection/cpp) | √ | √ | √ | √ | ? |
+| Object Detection | [PaddleDetection/PicoDet](./examples/vision/detection/paddledetection) | [Python](./examples/vision/detection/paddledetection/python)/[C++](./examples/vision/detection/paddledetection/cpp) | √ | √ | √ | √ | ? |
+| Object Detection | [PaddleDetection/YOLOX](./examples/vision/detection/paddledetection) | [Python](./examples/vision/detection/paddledetection/python)/[C++](./examples/vision/detection/paddledetection/cpp) | √ | √ | √ | √ | ? |
+| Object Detection | [PaddleDetection/YOLOv3](./examples/vision/detection/paddledetection) | [Python](./examples/vision/detection/paddledetection/python)/[C++](./examples/vision/detection/paddledetection/cpp) | √ | √ | √ | √ | ? |
+| Object Detection | [PaddleDetection/PPYOLO](./examples/vision/detection/paddledetection) | [Python](./examples/vision/detection/paddledetection/python)/[C++](./examples/vision/detection/paddledetection/cpp) | √ | √ | √ | - | ? |
+| Object Detection | [PaddleDetection/PPYOLOv2](./examples/vision/detection/paddledetection) | [Python](./examples/vision/detection/paddledetection/python)/[C++](./examples/vision/detection/paddledetection/cpp) | √ | √ | √ | - | ? |
+| Object Detection | [PaddleDetection/FasterRCNN](./examples/vision/detection/paddledetection) | [Python](./examples/vision/detection/paddledetection/python)/[C++](./examples/vision/detection/paddledetection/cpp) | √ | √ | √ | - | ? |
+| Object Detection | [Contrib/YOLOX](./examples/vision/detection/yolox) | [Python](./examples/vision/detection/yolox/python)/[C++](./examples/vision/detection/yolox/cpp) | √ | √ | ? | √ | √ |
+| Object Detection | [Contrib/YOLOv7](./examples/vision/detection/yolov7) | [Python](./examples/vision/detection/yolov7/python)/[C++](./examples/vision/detection/yolov7/cpp) | √ | √ | ? | √ | √ |
+| Object Detection | [Contrib/YOLOv6](./examples/vision/detection/yolov6) | [Python](./examples/vision/detection/yolov6/python)/[C++](./examples/vision/detection/yolov6/cpp) | √ | √ | ? | √ | √ |
+| Object Detection | [Contrib/YOLOv5](./examples/vision/detection/yolov5) | [Python](./examples/vision/detection/yolov5/python)/[C++](./examples/vision/detection/yolov5/cpp) | √ | √ | ? | √ | √ |
+| Object Detection | [Contrib/YOLOR](./examples/vision/detection/yolor) | [Python](./examples/vision/detection/yolor/python)/[C++](./examples/vision/detection/yolor/cpp) | √ | √ | ? | √ | √ |
+| Object Detection | [Contrib/ScaledYOLOv4](./examples/vision/detection/scaledyolov4) | [Python](./examples/vision/detection/scaledyolov4/python)/[C++](./examples/vision/detection/scaledyolov4/cpp) | √ | √ | ? | √ | √ |
+| Object Detection | [Contrib/YOLOv5-Lite](./examples/vision/detection/yolov5lite) | [Python](./examples/vision/detection/yolov5lite/python)/[C++](./examples/vision/detection/yolov5lite/cpp) | √ | √ | ? | √ | √ |
+| Object Detection | [Contrib/NanoDet-Plus](./examples/vision/detection/nanodet_plus) | [Python](./examples/vision/detection/nanodet_plus/python)/[C++](./examples/vision/detection/nanodet_plus/cpp) | √ | √ | ? | √ | √ |
+| Segmentation | [PaddleSeg/PP-LiteSeg](./examples/vision/segmentation/paddleseg) | [Python](./examples/vision/segmentation/paddleseg/python)/[C++](./examples/vision/segmentation/paddleseg/cpp) | √ | √ | √ | √ | ? |
+| Segmentation | [PaddleSeg/PP-HumanSeg-Lite](./examples/vision/segmentation/paddleseg) | [Python](./examples/vision/segmentation/paddleseg/python)/[C++](./examples/vision/segmentation/paddleseg/cpp) | √ | √ | √ | √ | ? |
+| Segmentation | [PaddleSeg/HRNet-w18](./examples/vision/segmentation/paddleseg) | [Python](./examples/vision/segmentation/paddleseg/python)/[C++](./examples/vision/segmentation/paddleseg/cpp) | √ | √ | √ | √ | ? |
+| Segmentation | [PaddleSeg/PP-HumanSeg-Server](./examples/vision/segmentation/paddleseg) | [Python](./examples/vision/segmentation/paddleseg/python)/[C++](./examples/vision/segmentation/paddleseg/cpp) | √ | √ | √ | √ | ? |
+| Segmentation | [PaddleSeg/Unet](./examples/vision/segmentation/paddleseg) | [Python](./examples/vision/segmentation/paddleseg/python)/[C++](./examples/vision/segmentation/paddleseg/cpp) | √ | √ | √ | √ | ? |
+| Segmentation | [PaddleSeg/Deeplabv3-ResNet50](./examples/vision/segmentation/paddleseg) | [Python](./examples/vision/segmentation/paddleseg/python)/[C++](./examples/vision/segmentation/paddleseg/cpp) | √ | √ | √ | √ | ? |
+| Face Detection | [Contrib/RetinaFace](./examples/vision/facedet/retinaface) | [Python](./examples/vision/facedet/retinaface/python)/[C++](./examples/vision/facedet/retinaface/cpp) | √ | √ | ? | √ | √ |
+| Face Detection | [Contrib/UltraFace](./examples/vision/facedet/ultraface) | [Python](./examples/vision/facedet/ultraface/python)/[C++](./examples/vision/facedet/ultraface/cpp) | √ | √ | ? | √ | √ |
+| Face Detection | [Contrib/YOLOv5Face](./examples/vision/facedet/yolov5face) | [Python](./examples/vision/facedet/yolov5face/python)/[C++](./examples/vision/facedet/yolov5face/cpp) | √ | √ | ? | √ | √ |
+| Face Detection | [Contrib/SCRFD](./examples/vision/facedet/scrfd) | [Python](./examples/vision/facedet/scrfd/python)/[C++](./examples/vision/facedet/scrfd/cpp) | √ | √ | ? | √ | √ |
+| Face Recognition | [Contrib/ArcFace](./examples/vision/faceid/insightface) | [Python](./examples/vision/faceid/insightface/python)/[C++](./examples/vision/faceid/insightface/cpp) | √ | √ | ? | √ | √ |
+| Face Recognition | [Contrib/CosFace](./examples/vision/faceid/insightface) | [Python](./examples/vision/faceid/insightface/python)/[C++](./examples/vision/faceid/insightface/cpp) | √ | √ | ? | √ | √ |
+| Face Recognition | [Contrib/PartialFC](./examples/vision/faceid/insightface) | [Python](./examples/vision/faceid/insightface/python)/[C++](./examples/vision/faceid/insightface/cpp) | √ | √ | ? | √ | √ |
+| Face Recognition | [Contrib/VPL](./examples/vision/faceid/insightface) | [Python](./examples/vision/faceid/insightface/python)/[C++](./examples/vision/faceid/insightface/cpp) | √ | √ | ? | √ | √ |
+| Portrait Matting | [Contrib/MODNet](./examples/vision/matting/modnet) | [Python](./examples/vision/matting/modnet/python)/[C++](./examples/vision/matting/modnet/cpp) | √ | √ | ? | √ | √ |
+
+
+## 3. Server-Side Quick Start
+
+
+
+💡 Install FastDeploy Python/C++
Choose the version that matches your development environment; see the [installation docs](docs/quick_start/install.md) for more supported environments.
-```
+```bash
pip install https://bj.bcebos.com/paddlehub/fastdeploy/wheels/fastdeploy_python-0.2.0-cp38-cp38-manylinux1_x86_64.whl
```
-
-准备目标检测模型和测试图片
+Or download the prebuilt C++ library; see [C++ prebuilt library downloads](docs/compile/prebuilt_libraries.md) for more available packages
+```bash
+wget https://bj.bcebos.com/paddlehub/fastdeploy/cpp/fastdeploy-linux-x64-0.2.0.tgz
```
+Prepare an object detection model and a test image
+```bash
wget https://bj.bcebos.com/paddlehub/fastdeploy/ppyoloe_crn_l_300e_coco.tgz
tar xvf ppyoloe_crn_l_300e_coco.tgz
wget https://gitee.com/paddlepaddle/PaddleDetection/raw/release/2.4/demo/000000014439.jpg
```
+
-加载模型预测
-```
-import fastdeploy.vision as vis
+### 3.1 Python Inference Example
+
+
+```python
import cv2
+import fastdeploy.vision as vision
-model = vis.detection.PPYOLOE("ppyoloe_crn_l_300e_coco/model.pdmodel",
- "ppyoloe_crn_l_300e_coco/model.pdiparams",
- "ppyoloe_crn_l_300e_coco/infer_cfg.yml")
+model = vision.detection.PPYOLOE("ppyoloe_crn_l_300e_coco/model.pdmodel",
+ "ppyoloe_crn_l_300e_coco/model.pdiparams",
+ "ppyoloe_crn_l_300e_coco/infer_cfg.yml")
im = cv2.imread("000000014439.jpg")
result = model.predict(im.copy())
print(result)
-vis_im = fd.vision.vis_detection(im, result, score_threshold=0.5)
+vis_im = vision.vis_detection(im, result, score_threshold=0.5)
cv2.imwrite("vis_image.jpg", vis_im)
```
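The `score_threshold=0.5` argument above drops low-confidence boxes before drawing. Its effect can be sketched on plain Python lists; the names `boxes`, `scores`, and `label_ids` mirror the layout of FastDeploy's `DetectionResult`, but the sample values here are made up:

```python
# Hypothetical detections: one [xmin, ymin, xmax, ymax] box,
# one confidence score, and one class id per object.
boxes = [[415.0, 89.3, 506.0, 283.9],
         [104.5, 45.5, 127.7, 93.5],
         [10.0, 10.0, 20.0, 20.0]]
scores = [0.95, 0.77, 0.21]
label_ids = [0, 0, 33]

def filter_by_score(boxes, scores, label_ids, score_threshold=0.5):
    """Keep only detections whose confidence is at least the threshold."""
    kept = [(b, s, l) for b, s, l in zip(boxes, scores, label_ids)
            if s >= score_threshold]
    if not kept:
        return [], [], []
    kept_boxes, kept_scores, kept_ids = map(list, zip(*kept))
    return kept_boxes, kept_scores, kept_ids

kept_boxes, kept_scores, kept_ids = filter_by_score(boxes, scores, label_ids)
print(len(kept_boxes))  # 2 -- the 0.21 detection is dropped
```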
+### 3.2 C++ Inference Example
+
-预测完成,可视化结果保存至`vis_image.jpg`,同时输出检测结果如下
-```
-DetectionResult: [xmin, ymin, xmax, ymax, score, label_id]
-415.047363,89.311523, 506.009613, 283.863129, 0.950423, 0
-163.665710,81.914894, 198.585342, 166.760880, 0.896433, 0
-581.788635,113.027596, 612.623474, 198.521713, 0.842597, 0
-267.217224,89.777321, 298.796051, 169.361496, 0.837951, 0
-104.465599,45.482410, 127.688835, 93.533875, 0.773348, 0
-...
+```cpp
+#include <iostream>
+
+#include "fastdeploy/vision.h"
+
+int main(int argc, char* argv[]) {
+  namespace vision = fastdeploy::vision;
+  auto model_file = "ppyoloe_crn_l_300e_coco/model.pdmodel";
+  auto params_file = "ppyoloe_crn_l_300e_coco/model.pdiparams";
+  auto config_file = "ppyoloe_crn_l_300e_coco/infer_cfg.yml";
+  auto model = vision::detection::PPYOLOE(model_file, params_file, config_file);
+
+  auto im = cv::imread("000000014439.jpg");
+  auto im_bak = im.clone();
+
+  vision::DetectionResult res;
+  model.Predict(&im, &res);
+
+  auto vis_im = vision::Visualize::VisDetection(im_bak, res, 0.5);
+  cv::imwrite("vis_result.jpg", vis_im);
+  std::cout << "Visualized result saved in ./vis_result.jpg" << std::endl;
+  return 0;
+}
```
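To build the C++ example above against the prebuilt package downloaded earlier, a `CMakeLists.txt` along these lines is typical. This is a sketch only: it assumes the archive was extracted to `fastdeploy-linux-x64-0.2.0`, that the example source is saved as `infer_ppyoloe.cc`, and that the package ships a `FastDeploy.cmake` exposing `FASTDEPLOY_INCS`/`FASTDEPLOY_LIBS`; check the extracted archive for the exact file and variable names.

```cmake
cmake_minimum_required(VERSION 3.12)
project(ppyoloe_demo CXX)

# Path to the extracted prebuilt package (assumed location).
set(FASTDEPLOY_INSTALL_DIR ${CMAKE_SOURCE_DIR}/fastdeploy-linux-x64-0.2.0)

# Assumed to define FASTDEPLOY_INCS and FASTDEPLOY_LIBS for consumers.
include(${FASTDEPLOY_INSTALL_DIR}/FastDeploy.cmake)

include_directories(${FASTDEPLOY_INCS})
add_executable(ppyoloe_demo infer_ppyoloe.cc)
target_link_libraries(ppyoloe_demo ${FASTDEPLOY_LIBS})
```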
-## 更多服务端部署示例
+## 4. More Server-Side Deployment Examples
+
FastDeploy provides a large number of deployment examples for developers to reference, supporting model deployment on CPU, GPU, and TensorRT
@@ -99,67 +164,71 @@ FastDeploy提供了大量部署示例供开发者参考,支持模型在CPU、G
- [Face detection model deployment](examples/vision/facedet)
- [More vision model deployment examples...](examples/vision)
-### 📱轻量化SDK快速实现端侧AI推理部署
+## 5. 📱 Lightweight SDK for Fast On-Device AI Deployment
+
| Task | Model | Size (MB) | Edge | Mobile | Mobile |
| ------------------ | ---------------------------- | --------------------- | --------------------- | ---------------------- | --------------------- |
-| ---- | --- | --- | Linux | Android | iOS |
-| ----- | ---- | --- | ARM CPU | ARM CPU | ARM CPU |
-| Classfication | PP-LCNet | 11.9 | ✅ | ✅ | ✅ |
-| | PP-LCNetv2 | 26.6 | ✅ | ✅ | ✅ |
-| | EfficientNet | 31.4 | ✅ | ✅ | ✅ |
-| | GhostNet | 20.8 | ✅ | ✅ | ✅ |
-| | MobileNetV1 | 17 | ✅ | ✅ | ✅ |
-| | MobileNetV2 | 14.2 | ✅ | ✅ | ✅ |
-| | MobileNetV3 | 22 | ✅ | ✅ | ✅ |
-| | ShuffleNetV2 | 9.2 | ✅ | ✅ | ✅ |
-| | SqueezeNetV1.1 | 5 | ✅ | ✅ | ✅ |
-| | Inceptionv3 | 95.5 | ✅ | ✅ | ✅ |
-| | PP-HGNet | 59 | ✅ | ✅ | ✅ |
-| | SwinTransformer_224_win7 | 352.7 | ✅ | ✅ | ✅ |
-| Detection | PP-PicoDet_s_320_coco | 4.1 | ✅ | ✅ | ✅ |
-| | PP-PicoDet_s_320_lcnet | 4.9 | ✅ | ✅ | ✅ |
-| | CenterNet | 4.8 | ✅ | ✅ | ✅ |
-| | YOLOv3_MobileNetV3 | 94.6 | ✅ | ✅ | ✅ |
-| | PP-YOLO_tiny_650e_coco | 4.4 | ✅ | ✅ | ✅ |
-| | SSD_MobileNetV1_300_120e_voc | 23.3 | ✅ | ✅ | ✅ |
-| | PP-YOLO_ResNet50vd | 188.5 | ✅ | ✅ | ✅ |
-| | PP-YOLOv2_ResNet50vd | 218.7 | ✅ | ✅ | ✅ |
-| | PP-YOLO_crn_l_300e_coco | 209.1 | ✅ | ✅ | ✅ |
-| | YOLOv5s | 29.3 | ✅ | ✅ | ✅ |
-| Face Detection | BlazeFace | 1.5 | ✅ | ✅ | ✅ |
-| Face Localisation | RetinaFace | 1.7 | ✅ | ❌ | ❌ |
-| Keypoint Detection | PP-TinyPose | 5.5 | ✅ | ✅ | ✅ |
-| Segmentation | PP-LiteSeg(STDC1) | 32.2 | ✅ | ✅ | ✅ |
-| | PP-HumanSeg-Lite | 0.556 | ✅ | ✅ | ✅ |
-| | HRNet-w18 | 38.7 | ✅ | ✅ | ✅ |
-| | PP-HumanSeg-Server | 107.2 | ✅ | ✅ | ✅ |
-| | Unet | 53.7 | ❌ | ✅ | ❌ |
-| OCR | PP-OCRv1 | 2.3+4.4 | ✅ | ✅ | ✅ |
-| | PP-OCRv2 | 2.3+4.4 | ✅ | ✅ | ✅ |
-| | PP-OCRv3 | 2.4+10.6 | ✅ | ✅ | ✅ |
-| | PP-OCRv3-tiny | 2.4+10.7 | ✅ | ✅ | ✅ |
-
-
-#### 边缘侧部署
-
-- ARM Linux 系统
+| --- | --- | --- | Linux | Android | iOS |
+| --- | --- | --- | ARM CPU | ARM CPU | ARM CPU |
+| Image Classification | PP-LCNet | 11.9 | ✅ | ✅ | ✅ |
+| Image Classification | PP-LCNetv2 | 26.6 | ✅ | ✅ | ✅ |
+| Image Classification | EfficientNet | 31.4 | ✅ | ✅ | ✅ |
+| Image Classification | GhostNet | 20.8 | ✅ | ✅ | ✅ |
+| Image Classification | MobileNetV1 | 17 | ✅ | ✅ | ✅ |
+| Image Classification | MobileNetV2 | 14.2 | ✅ | ✅ | ✅ |
+| Image Classification | MobileNetV3 | 22 | ✅ | ✅ | ✅ |
+| Image Classification | ShuffleNetV2 | 9.2 | ✅ | ✅ | ✅ |
+| Image Classification | SqueezeNetV1.1 | 5 | ✅ | ✅ | ✅ |
+| Image Classification | Inceptionv3 | 95.5 | ✅ | ✅ | ✅ |
+| Image Classification | PP-HGNet | 59 | ✅ | ✅ | ✅ |
+| Image Classification | SwinTransformer_224_win7 | 352.7 | ✅ | ✅ | ✅ |
+| Object Detection | PP-PicoDet_s_320_coco | 4.1 | ✅ | ✅ | ✅ |
+| Object Detection | PP-PicoDet_s_320_lcnet | 4.9 | ✅ | ✅ | ✅ |
+| Object Detection | CenterNet | 4.8 | ✅ | ✅ | ✅ |
+| Object Detection | YOLOv3_MobileNetV3 | 94.6 | ✅ | ✅ | ✅ |
+| Object Detection | PP-YOLO_tiny_650e_coco | 4.4 | ✅ | ✅ | ✅ |
+| Object Detection | SSD_MobileNetV1_300_120e_voc | 23.3 | ✅ | ✅ | ✅ |
+| Object Detection | PP-YOLO_ResNet50vd | 188.5 | ✅ | ✅ | ✅ |
+| Object Detection | PP-YOLOv2_ResNet50vd | 218.7 | ✅ | ✅ | ✅ |
+| Object Detection | PP-YOLO_crn_l_300e_coco | 209.1 | ✅ | ✅ | ✅ |
+| Object Detection | YOLOv5s | 29.3 | ✅ | ✅ | ✅ |
+| Face Detection | BlazeFace | 1.5 | ✅ | ✅ | ✅ |
+| Face Detection | RetinaFace | 1.7 | ✅ | ❌ | ❌ |
+| Keypoint Detection | PP-TinyPose | 5.5 | ✅ | ✅ | ✅ |
+| Segmentation | PP-LiteSeg(STDC1) | 32.2 | ✅ | ✅ | ✅ |
+| Segmentation | PP-HumanSeg-Lite | 0.556 | ✅ | ✅ | ✅ |
+| Segmentation | HRNet-w18 | 38.7 | ✅ | ✅ | ✅ |
+| Segmentation | PP-HumanSeg-Server | 107.2 | ✅ | ✅ | ✅ |
+| Segmentation | Unet | 53.7 | ❌ | ✅ | ❌ |
+| OCR | PP-OCRv1 | 2.3+4.4 | ✅ | ✅ | ✅ |
+| OCR | PP-OCRv2 | 2.3+4.4 | ✅ | ✅ | ✅ |
+| OCR | PP-OCRv3 | 2.4+10.6 | ✅ | ✅ | ✅ |
+| OCR | PP-OCRv3-tiny | 2.4+10.7 | ✅ | ✅ | ✅ |
+
+### 5.1 Edge Deployment
+
+
+- ARM Linux
- [C++ inference deployment (with video streams)](./docs/ARM-Linux-CPP-SDK-Inference.md)
- [C++ serving deployment](./docs/ARM-Linux-CPP-SDK-Serving.md)
- [Python inference deployment](./docs/ARM-Linux-Python-SDK-Inference.md)
- [Python serving deployment](./docs/ARM-Linux-Python-SDK-Serving.md)
-#### 移动端部署
+### 5.2 Mobile Deployment
+
- [iOS deployment](./docs/iOS-SDK.md)
- [Android deployment](./docs/Android-SDK.md)
-#### 自定义模型部署
+### 5.3 Custom Model Deployment
+
- [Quickly swap in a custom model](./docs/Replace-Model-With-Anther-One.md)
-## 社区交流
+## 6. Community
+
- **Join the community 👬:** Scan the QR code in WeChat, then fill out the questionnaire to join the group chat and discuss inference deployment pain points with other developers
@@ -167,11 +236,13 @@ FastDeploy提供了大量部署示例供开发者参考,支持模型在CPU、G
-## Acknowledge
+## 7. Acknowledge
+
SDK generation and download in this project use the free open capabilities of [EasyEdge](https://ai.baidu.com/easyedge/app/openSource), for which we are grateful.
-## License
+## 8. License
+
FastDeploy is licensed under the [Apache-2.0 License](./LICENSE).
diff --git a/docs/compile/prebuilt_libraries.md b/docs/compile/prebuilt_libraries.md
index bd58fc4b0b8..f960b105ba0 100644
--- a/docs/compile/prebuilt_libraries.md
+++ b/docs/compile/prebuilt_libraries.md
@@ -1,4 +1,4 @@
-# FastDeploy 预编编译Python Wheel包
+# FastDeploy Prebuilt C++ Libraries
FastDeploy provides prebuilt C++ deployment libraries for Windows/Linux/Mac. Developers can download and use them directly, or build from source.
diff --git a/docs/logo/fastdeploy-logo.png b/docs/logo/fastdeploy-logo.png
new file mode 100644
index 0000000000000000000000000000000000000000..fe94ed2f5d781c19546bf05bc7b3643486ffd618
GIT binary patch
zZr_p!_Zw_f0WAfliYjZDb|!38sD9tGgT6XGTn(XCMm!lD7h=t1zZ4w{EmokMo{9yb
zK?kaW(VufG9==CaFlVr|Hg@utQUY49CP*&$p{q&hxlJLm{m4=!IJnTYX?0lFzp!#!!WuceR`J75EELllU;ZGyRK~$COjMi^-^@gU7-mu(0O?(x
zfAUe^(w(Dc{StLR8jmrq45&nE2HCBW-Cnh9kTSJQE&2=tP3Y%C6^qiIeU$R}V^%{J9
zDg%EzeE)a(6Vs;#BN@UQQ}|ypTMZiO;mtd|wVM0*w_ECQMk{PwiLCZrSPgOMF%eo;
z737*DvZH>)7G-Sde1;BA)r-y{W1MF>P#CdWEKIfs1t9vaUz6PwX(G;cb9^P@ICCh{
zjaqANBk0t9)SL;{Re}2?>{v6G2nUL`#VK+Q@avX;=Hz=V
zID9^aZXJDi7n@reZf@pU65GeUCAmP2LRG2iUt|tQ>B)9rJvMbr7ijSyp*LM+&h10$
z!waHf{HA`Ty?^fGmDygOYX*l&p+;WB3F3)nxTK`^R?jNOEVc2_Ecso5#t!COP0k~Q
z1c}rP0{(N$+qsOHcB@{9x466)bBF?q<+jFvgY>07Gk9_4VaeUxI6~ddlCLD=o79Q!
z@O>~OL*`S;T+I6Tk%6C3_s}B
zLl{chbh%ypEVcw3rqc6;VnPxvq?}M@nADI8&Qj7WB*Y)63IwB4V4_u-%2eH!g?=cl
z#u(i^+o-rBuQG<25}lO_I>>j$#-i|hMd#NJF9jtHiM4_f@-;w&AWweR!70+weE(o!
z#(6y^++1U;PKi=@K?G)P6aqEMS#N6O^aR7U9^jo1!c9sDL(dpQ|Ex5GVECDr;BSeS
zJtF0ak3$R0f0&)PwnLwQ3&DhZiLX=tjHa_(vi8c08HfD9Fj5j$@6*z4q24=cTSXVSy=peNiN<+o5(3Vx(`rMgxIsx2bBWt
zn>qG0ONHCd=?+`^(FF984FQao9rmH(jP$agPi%boujL7DmHItki=likVnmJwzM$*7
z?sLvnai88f$5p91PI6tCGP~@*^C>4~;q-Ee|$bF{IXnD
zE4wkn(lJ4ZEp2Me031%ee_YQCE8a*}o5%Uy;Bf>$CA-4rQF*yoeKW2QKRCP$SjEK1
z-wAmknl&d06H1}upPEw8=vW;3&c}5**0KG!gF9W8>YoS=>I^lfgO8XvR=I^!8o@6e
z2zejxaC#KS&(Vt_&qWsK=?MJCWRRdN@oLkcOcyA7(u?^@^J|FakM3$~MXkWwso+dU
z;tB{+?zFk>;p)vRtLtJf5Jeb3XD65;ypRicbdTdV%ofR0wO~EG{C4znnwz119x^y8
zO(mq&^rVKDKt^#H(ve>{AIy
z@#h=aPV$e{Y29ka6k76)e-rjBWQ%%YxD_zhElfR(srT`z7PHmq{hwr5{BwKrx6<7I
z|MF~tQ|T@*@~YZ4ewv$Pce`(Ou_OuW3GSAay7@i(?cgtqoAGA-ksf+<&ZuES(~2Ze
znz0H96b3a~_fvV@0|nT(l3T)ViyR4>^`0(%=EwfRL#xsW3{-YHxRd?G6i8H7s@d)f
z4eQK!_euDdx^3rN-p};dm&M+ny(dd?slbs~J-Mu2o1KYn>sYZ6UnCdqSL?YnEKAVk
zBu1{IAzk1n$Zde7z;J_TE{aQ)EZMvZa}LsX8!m&0^4qVAbSh%CuWR)XfbX{k1oOT=
zKY4RY(n<*i=4*;d`U5EjCiOjgxy!`&*eZ_WUp{R&2)e6DJZ#krP;7NY6bh9fkPm=9ISvP_}#H2)5}d
z_o*nAu-}OnGQuCgnkFyaXqv`FaxkUq0UV;7iE($>Bog$a$ZdoPbz>MFZCYLoNffr2)Av^Q0Mbi!fsBj@PK*&K%Q^wiY@!2KOG0lL7p
z-U{PjQ>1UeU_fD(bOrXLiDG?mAs2pAV9z96UoWL#U?mfdjD`8=Ib9(y(^Yrb!|{-P
zjx1YNCsjLJh&C;riOPX#W_0)pzm9u^HclVmQf^8O9%NLl;TQ28=U5W}{@I)j3RmT^
zC{`w-p#8=`iUHS@QdM(!p#jU5J+`U4HVnj6ERXEc^!(LvT<&`;cemSz!q`vTHSJRk
zK{XxTUio^uN3}Hdl$dU46D>ysD=Wt9xGmyyL2RnZT@Yj%-oH2Cq
zn8m!~(Z|!NFZBhp(#H>vb$CETcCX6R$=Tj0fVy^J`bh*VUBt%hi)itf<=i>jNf*#Dx!?
zmU;xYl;4S0=)lgOTNMq<@$KLL8M%^9Id~M-IY7-F39iIyAP=ZOUPUg}l`J2VrC&HD
z*?WpLp;$Mkacp#EOkWR)pT7X6{ss$!F*I~39h&o_mOM$;fY;2~7iOBofyiGY&9Q*}
zB8tzCPD=xH{i9Myn=s8-2Fs1VKz|FqJmB6Y)qQna9|6N^RA4Y0USA)-e#?rq7pxJx
zJ2c~keMG$c^^sEWZC0=-(TiK7Obvdq*-A%x7BXti5S|zh!O&h-@b}i-!ABg1Taur<
z_qUI*wLShpkNk(hj*O`kJXsO_gYllMfvc!^-^w+Vnn6E%0lBTI*+*Id>rEte{M2gk@VjG?~Wng5^?=uZ~xx-|u^TK$)a~b`locek$+i$>mgyr3QZCuwZ#&hY9;u39DKvS
z5c503HPB0;0eEn(b31o3PA)Vh_Mf0UZ>k;t>ci)!@XTE~Flpz@l^p;Z?(&3X^WE!U
zk_@ZhB!v~zN;&q8Ry{+fw(0ze3)D-X=>?(pKNH2aoh5&I0fthF%
zcKj*>RLLs++~7Cw&skkNxN03#`&m#7!2wBbIlt^a;r5lKB(PxjOZ^3)A`VrMwe
z?~v>%H!?bmBmhV>0PC_Q5-nuVJ{h9U%y(q|hP|XF@*u)v7M!;UP4ra(zXa^H*L(Q%
zD}Wuc=(_>ExMNRtHaKPZ8`GLk4Yn(sXkz4tRKW9P9*qCfe$?(*yHkm^LptR;cTBJ5
zK;K(tI-k{~Rc|El2w2BVO+brg=`8&BqWI%nui9O%YM=EN%IN<+)KBa42iDZBF`DNJ
zdCg5kFDYB_H{q1CAAGbI->j-TX-}I3loHcp@12_hIn&c)Ne`djQ$E0}`*&heA^6;Y
zmuA5Tq@v2mnNKt_MH6+!`O2T`tn%fi`+VV$R<)$d6(FIK;9_CgzD-8sFI&=aZFSi+FsGi4z^E?-0Ob3@gODw
zIGIT@5vJUeesAihwX@)4Waha^WhRBeO&?`+z}0C?`;S!$!ZtD0bsOubzy?UJC3zcn
zLmQ+--2-iP{?5ED6?JV=Q?owCF%oLV|3OW7%k=7o^yZ%HU2;P5jK+D%$g(1@yeBVT
z4ui{9Aiqe}AF%Czrs1WXt;e5-`~o|qFsTNJ!=K-o)wdwicG+FUAkkn?$`yKMo-jjg
zWFY5aMz)uhr81=EzMsF{;QrJ7l@)V*;nj2OG;Yz8UZ(1Z(bTsY1#PD`-vd_Y;H#i`
zr&mWuvaG+XNolTx90aZfBjZMu>8cLO`V}kF(dAScX1DK$&mGpf0slqkcEx
zWI4&%v(1!*y}3_j$oC#%^@7}o#h<~SxwFRh*=$RqyAFR!@ze--D5ajrOo@o1Wv512
zm?Oz|b!T=L7(Mg(6)97)e#W5!Ze~aaE=3XNgmD5sUB=>sr;H|fUZrs(8gJs)lM>JD
zOGGQ=+ZdU?hnzMttVAx+HsP+wHFp2(KenGTl$@Zr=QF*5MAesiRyOtvq97qCbSO(-
zOkA`uTFSQOv0t36T5K}3?Fa)xZK7@c~bX^Y}$IX}{0sztA2LE?}n4KJ7T#5c&75%e%
zpST*jJR|uZOiwST*E|zdzV+_O$umJC{{e??02km#xpzm$^Fq1XCYRDGIP>waRf_$9aF^61Koa>cm&nUZJe(aX*a$e|W`AyupBeG!O#k4@uqyh?
z2z7T>t%*+Uk6Tt}gYoKv7171k4VNy-n5Gp7Pl(`KSz;N>t-x)euywIulH0u*suB2d
z#HRe<_{UGtC?haI5S02D*eXkPE(Mq}{anHX2H3bw0TNHAn?>sP`cL+qluT;&Htwfl
zvK{$fWAKjU&mo5HxrPIUkM}*8TNbPm#s6D8VZ+g6*y+A_{8joB9T)N4E`}I
zQ~oWZxZ6moH*QY*em8l6|Rfja5)KDRu-
z~t%$5db7+a{!BlgWK=I!`rPs>Lu83;?)roSX30_HZ`BOdwLziPQnOJCP*
zh*#i$g^FW9$^5{a7WJ+e!Z
zj!2EP`;Y^NGt=L2jtJ@*#5|tc^Qj}1K7EqEX*vQ|-I9Q<
z=~eMOP`BPVr^XOWe*qeTLwxgiW^AmNydbAWQM{t-@Nl6P9UKF6sYfnb!Yz;ee!C8U
ztlG~eILhHd7&tRMK#dmI#XCosLt`6(wAUjJw2^H!(|^7~nnHpXA#&Zd(oa6^Pmva*
zPiZd?;&yRRgTwYAMz_|VZchE0ABSOzx^u!FG6@t?=-}Snmi=8$-YEcgTlR2kqp~bq
z!+HBzoC=1j8%CO?-s4Dc*yJ-^Xg-cvz^DoH_C3zpOnd4$4{%4QZf+U=CWKlz>^q{+
zOK|Y3tNj#;$03Rk_4GQAEBrFrrLa6FCq2IsW|+jQWJhJX-ZhLtc;coJ^>MII((*=q
z=uI~%cD|T+!$2Ocu<{%l3r<8aY76Fc@S{vs+T~tH>v}wBR7I#`woLFFuz7o=Id7fB
z^Ri|izA7kf-pOe}NiHz$_z|mF4QLJCs`1aL1ZsbK?#aFcK4lOs#s)~{#q*(yD%nwa
zm64a#e(Kfew4-k13*SwtDPRlJsnFOY`~6uzeFIOpT|vmd8gz%rhr87R8o6^2kia(~
zRbg5$+ss-cZm<1r_cSiT@T3bwAEQ~a-fq-hw?@q(=66ompU?bK?B5el)-y>CkvM~W
zs9335U9dK%?VEFZu&;4oU`Qija6ER`Mjk*};8iL@YIb)XLyfp7bnby{EZfe}-E!`f
zRMS-x2GlkB2UQZqT
zzoJMI7YmV(P?EN7lZX1ZuBb187mm;C)p*scCZoNXI4mPB^PV2k*wku{#wg7l}
zgKbTz7m~qCY?~4HV&0c6{qj4_;|sFnH@AHwqcg$Y7g$#%p{JezWKaQcPCo;!=eG2E(#_Oy`KDrvn{N)^eePDnosF(r4SGx>W|=p!
zNMBOFYnmhv?$19%m%4d0lN!mAPk)l6&WC_c3b1ylxQ;1U`lH98bFa$@_dXntQa4Ua
zUb!QXohU@FB5Q0;QGhT7;T@EW(L6acVVrGLXeo5Pr_W%GWVc#sLIA2@Rud~=8wKIu
ztUeai%ntarwQ^_;uW6PxB3);cGp^!7UI1krz#%^&i!dTEo6C(86VV^N>b~9*$-8EA
zDBddTSY>}9(<(eA$1YE#$We-eVRwox=hdiVHOeN0@(t?LQ8Mr2I
zL6S0YuIyoDxA*JJj~4<9J?2p72GW0AP=+}m%`qU8qf%?SL1r7Wgb7(33{ZSW54ySK
zi^vwgrc2Dl)Junnh_F0M=`nM^&v86*PQ0C!a~qgVW-dC7_6U9Wo8KW+-tIqo#thb(
z8aPqC2{80Z;yQjm?EFExnWw(ycQ;}O-+XH?Pn2#@&g4>(0lRpC$zPLduQjI#-KjWa
zp9tZ}aXRM_U%JTsciT#g{5Yq-`41Cwg1cvc3PP<2KTk!rUTGGr^$dOSS}0Ya1NC|#
z#(KD8x0CpH2uMkKNrY{wwcMKTi;ey<9LbJ*!-jN?iO4Ut)aQ6r{Hk+B*B%y*RGuPo
z<^r|o+%}Lf{cH$TUzXtJH<`l-6osCHZ==M|{b29K(xMw%DBgaNUTR3%9M_n6QoOdu
zW|xHVH&O}%&fG?@&);+PTFlTUuGW1@tLxAW^}r_{A`UZX(am-Cz_A`l2WI>@{TS>i_kfj@VV8ECX2PSSvIHxqBI(_
zuXRzB(lnlC|8PFpNZb@Er5pR(b?;@q;xFYW*Kuw1xwGeFT%dl7?citwTrAU+B}Cn+
z`wK~|Mt5`yQf$v>b-H`?uQhv-aFQ{pNBQsJM41284<#tK#Upl$I0_U#dGi01>9k_s
z#OvNdELr1Fh_;EI`_12by+@Rth4}tH?NA}UCX6g629e{pIT7-P;2Eol7JdyRuYiuV
zcV(%hxszIX-R}w={wdtMG{6jDr}08R_+zDed%nAi7hSJ|tckS;AEM1_A(&CSM{T-t
zl=oqe>c@B7v$pd1WuBYg?~`dd8;lz*zD9U_`!CW~wuMjpRHHk~Pjp+6QD%CV|XR`47h1E!mVGzsF_oChQ@#DLCBoJAbNaqhgkitojY53
zUhsuv43W9s$^s<=YxP-`=8ROM9ALRpnMZT1wZ~tPNS^qa_?j%F6%G={ACtYg*_GYm
z!oQ>b+yXxy0
z^EhJ8U7*pL>FBCq{C!wV!l`EH_QKQL3(+Fe?&!Nhh$mkQt2@?Gel{WhUx7)T`-{XD
z6bI(ywnl@Sh{IpC8{;WYuVOb|QT|;{+m|tWH--?tir)EShUq>d#4py(8A~;d?yt(?8M61_kHc%7@$iS$9