refine paddle inference api #26774
Merged: jiweibo merged 8 commits into PaddlePaddle:develop on Aug 28, 2020
PR types
New features
PR changes
APIs
Describe
Another copy of PR #26098
For historical reasons, Paddle inference currently exposes several kinds of APIs, which makes them hard for users to navigate. We have therefore redesigned the API. The details are as follows:
The new API is forward compatible: all previous APIs can still be used, but they will be officially removed within the next two releases.
In this version, all interfaces live in the paddle_infer namespace:
Use paddle_infer::Config to set up the inference configuration.
Use paddle_infer::Tensor to represent inputs and outputs.
Use paddle_infer::Predictor to represent the inference engine.
Use paddle_infer::CreatePredictor(Config& config) to create the inference engine.
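The four classes above can be combined into a typical inference flow. Below is a minimal sketch under stated assumptions: the header name, the model path "model_dir", and the 1x3x224x224 input shape are illustrative and not taken from this PR; this is not a definitive implementation.

```cpp
// Sketch of the redesigned paddle_infer workflow (paths/shapes are assumptions).
#include "paddle_inference_api.h"  // assumed header providing the paddle_infer namespace

#include <functional>
#include <memory>
#include <numeric>
#include <vector>

int main() {
  // 1. Describe the inference configuration with paddle_infer::Config.
  paddle_infer::Config config;
  config.SetModel("model_dir");  // hypothetical model directory

  // 2. Create the inference engine with paddle_infer::CreatePredictor.
  std::shared_ptr<paddle_infer::Predictor> predictor =
      paddle_infer::CreatePredictor(config);

  // 3. Feed input data through a paddle_infer::Tensor handle.
  auto input_names = predictor->GetInputNames();
  auto input = predictor->GetInputHandle(input_names[0]);
  std::vector<int> shape{1, 3, 224, 224};  // assumed input shape
  input->Reshape(shape);
  std::vector<float> data(1 * 3 * 224 * 224, 0.f);
  input->CopyFromCpu(data.data());

  // 4. Run inference and copy the output tensor back to the host.
  predictor->Run();
  auto output_names = predictor->GetOutputNames();
  auto output = predictor->GetOutputHandle(output_names[0]);
  auto out_shape = output->shape();
  int out_num = std::accumulate(out_shape.begin(), out_shape.end(), 1,
                                std::multiplies<int>());
  std::vector<float> out_data(out_num);
  output->CopyToCpu(out_data.data());
  return 0;
}
```

The sketch requires linking against the Paddle inference library, so it is not compilable standalone; it only illustrates how Config, CreatePredictor, Predictor, and Tensor fit together in the new namespace.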