Commit b2e0661
Paddle Tensor Operation Library initial implementation (PaddlePaddle#34425)
* initial tensor design & sign kernel demo
* add move constructor for meta & add lodtensor
* add dirs & sign xpu kernel
* add mean cpu&cuda kernel impl
* move sign & mean xpu & npu kernel
* add selected_rows basic impl
* refactor design, BaseTensor to DenseTensor, etc.
* add scale mkldnn kernel
* polish xpu & npu impl details
* fix mkldnn reuse compile failed
* change tensor operation lib name
* rename util filename
* add more comments
* change TensorImplInterface to TensorInterface
* add kernel key and factory (see the kernel-factory sketch after this commit list)
* remove MKLDNNTensorMeta, add MKLDNNDenseTensor
* change XXDeviceContext to XXContext
* add base kernel registrar utils & test on sign
* replace boost::any by paddle::any
* fix several ci failures
* fix npu compile error
* add ordered map util
* fix multiple ordered_map compile errors
* move dev into include dir
* support sign op in static op run
* fix static op run error
* fix new executor compile failed
* add dygraph branch & remove sign_op.h
* fix test_infer_no_need_buffer_slots
* fix rocm compile link error
* fix unitybuild error & clear glog
* fix npu compile failed
* skip quant trans test
* fix part of windows compile problem
* fix xpu enforce error
* fix inference test failed
* remove ordered_map to solve quant failed
* fix part of rocm compile failures
* add more register kernels
* revert scale kernel temporarily
* fix code format error
* add new kernel registrar macro
* rename top to tcmpt
* revert xpu, npu, mkldnn impl & remove op def
* add kernel args parse functor to auto parse args
* revert some change & add scale kernels
* add op proto in dygraph kernelcontext building
* polish kernel dispatch logic & naming rule
* fix scale kernel match error
* fix scale test failed
* add mean API and unittest
* test mean api success
* add branch to solve compile error
* skip clang format error
* add mean skip rule in op_library
* add dot kernel, api and unittest (PaddlePaddle#6)
* remove old kernel and add symbol link
* fix dot compile failure
* add macro for module declaration
* fix npu and xpu compile error
* revert sign, mean, scale, dot kernel removing
* add comment for keeping old kernel impl
* fix mutable_data error
* fix bfloat16 conflict
* fix inference undef error
* adapt to msvc compile rules
* polish comment for template inst
* add cmake template instantiation for win
* fix backend to place device id bug
* fix ifdef error
* Op2functor (PaddlePaddle#7)
* add kernel args maker class
* make args maker non-const
* remove debug log
* modify codes by review options
* split constructPrKernelContext function
* fix output name bug
* fix test_mean_op test_sign_op failed
* fill_any_like kernel refactor (PaddlePaddle#10)
* fill_any_like kernel refactor
* remove useless code of full_like c++ api
* skip dtype for fill_any_like
* add attrs for kernel key construction
* add use_pt_kernel Flags to control whether to use pt kernel (PaddlePaddle#13)
* add use_pt_kernel Flags to control whether to use pt kernel
* change the default value to true for checking pt kernels
* fix mutable_data cuda place error
* move high level apis into hapi
* remove selectedrows adapting temporarily
* Support Scalar in Tensor Compute Library (PaddlePaddle#14); see the Scalar sketch after this commit list
* fill_any_like kernel refactor
* remove useless code of full_like c++ api
* Support Scalar in Tensor Compute Library
* add scalar in dygraph and static graph mode
* keep the basic type for attr, instead of using scalar for all
* merge the code
* remove mkldnn tensor & polish details
* use flat_hash_map and small_vector in kernel factory
* Refactor flatten kernel (PaddlePaddle#12)
* refactor flatten kernel
* update infershape function
* fix compile bugs
* fix bugs when merge
* fix compiler bugs
* fix bugs when run test_flatten_api
* fix bugs when run test
* Revert "use flat_hash_map and small_vector in kernel factory"
This reverts commit 2309149.
* Move cpu, cuda and other device code into kernels (PaddlePaddle#15)
* fill_any_like kernel refactor
* remove useless code of full_like c++ api
* Support Scalar in Tensor Compute Library
* add scalar in dygraph and static graph mode
* keep the basic type for attr, instead of using scalar for all
* merge the code
* start refactor matmul
* move cpu, cuda and other device modules into kernels
* merge code
* polish code in operator.cc
* Perfect unittests (PaddlePaddle#16)
* perfect unittest
* update license
* replace with flat_hash_map, small_vector (PaddlePaddle#19)
* fix small_vector build error on windows platform
* replace with flat_hash_map, small_vector
* remove todo
* Perfect unittests (PaddlePaddle#20)
* perfect unittest
* update license
* fix bug when run tcmpt_utils_test
* refactor execution adapting impl
* fix insert conflict
* Fix CI bug of test_yolov3 (PaddlePaddle#21)
* fill_any_like kernel refactor
* remove useless code of full_like c++ api
* Support Scalar in Tensor Compute Library
* add scalar in dygraph and static graph mode
* keep the basic type for attr, instead of using scalar for all
* merge the code
* start refactor matmul
* move cpu, cuda and other device modules into kernels
* merge code
* polish code in operator.cc
* Fix CI bug of test_yolov3
* add the tensor base class, test=develop (PaddlePaddle#17)
* update the tensor base class, test=develop
* remove two funcs, test=develop
* update the error msg, test=develop
Co-authored-by: Chen Weihang <[email protected]>
* [no-verify] commit backend and tensor signature changes
* Rename tcmpt to pten (PaddlePaddle#23)
* rename tcmpt to pten
* update omitted files for rename to pten
* update omitted file for rename to pten
* remove the k prefix from all enum vars
* remove kernel_instantiate (PaddlePaddle#26)
* remove symbols and spatial_tensor
* change common to functions
* readd share tensor impl methods
* add a candidate dense tensor class, test=develop (PaddlePaddle#28)
* change all Pt to Pten
* resolve conflict with xiaowei
* Op2functor opt1 (PaddlePaddle#27)
* replace with small vector and change to const &
* add std::move
Co-authored-by: Chen Weihang <[email protected]>
* polish kernel factory and kernel registry
* fix operator test error msg mismatch
* remove tensor signature and backend set member
* move scalar and polish enforce
* revert dtype layout change to fix error
* fix enum operator override error
* add several base unittests
* add pten utils tests
* polish some details
* Dev/op2func refactor 3 (PaddlePaddle#30)
* add a candidate dense tensor class, test=develop
* remove TensorBase::backend(), test=develop
* remove some ops, test=develop
* cherry-pick the pr of tensor meta, test=develop
* moves the dense tensor and some ops, test=develop
* update the linalg operator, test=develop
* update other operators, test=develop
* fix errors, test=develop
* fix bugs, test=develop
* try to resolve the problem of windows ci, test=develop
* updates codes, test=develop
* fix the tensor_utils.cc, test=develop
* modify the dense tensor, test=develop
* fix the data type, test=develop
Co-authored-by: shixiaowei02 <[email protected]>
* polish some details
* polish kernel signature details
* fix a bug about offsets of the tensor, test=develop (PaddlePaddle#31)
Co-authored-by: shixiaowei02 <[email protected]>
* polish some details
Co-authored-by: chentianyu03 <[email protected]>
Co-authored-by: zyfncg <[email protected]>
Co-authored-by: YuanRisheng <[email protected]>
Co-authored-by: 石晓伟 <[email protected]>

1 parent 80fa602 · commit b2e0661
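Several commits above (kernel key and factory, base kernel registrar utils, kernel dispatch logic) revolve around selecting a kernel implementation by backend and data type. The sketch below is a minimal, self-contained illustration of that pattern only; all names in it (Backend, DataType, KernelKey, KernelFactory, SignCPU) are illustrative assumptions and are not the actual pten/Paddle API.

```cpp
// Illustrative sketch of a kernel registry keyed by (backend, dtype).
// Hypothetical names throughout; not the real pten interfaces.
#include <cstdint>
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

enum class Backend : uint8_t { CPU, CUDA, XPU, NPU };
enum class DataType : uint8_t { FLOAT32, FLOAT64, INT32 };

// A kernel key combines the pieces used to pick one registered kernel.
struct KernelKey {
  Backend backend;
  DataType dtype;
  bool operator==(const KernelKey& other) const {
    return backend == other.backend && dtype == other.dtype;
  }
};

struct KernelKeyHash {
  size_t operator()(const KernelKey& key) const {
    return (static_cast<size_t>(key.backend) << 8) |
           static_cast<size_t>(key.dtype);
  }
};

// For brevity a kernel is a plain callable over vectors; the real library
// passes tensors through a kernel context instead.
using KernelFn =
    std::function<void(const std::vector<float>&, std::vector<float>*)>;

// Singleton factory: op name -> (kernel key -> kernel function).
class KernelFactory {
 public:
  static KernelFactory& Instance() {
    static KernelFactory factory;
    return factory;
  }
  void Register(const std::string& op, KernelKey key, KernelFn fn) {
    kernels_[op][key] = std::move(fn);
  }
  const KernelFn& Select(const std::string& op, KernelKey key) const {
    return kernels_.at(op).at(key);
  }

 private:
  std::unordered_map<
      std::string, std::unordered_map<KernelKey, KernelFn, KernelKeyHash>>
      kernels_;
};

// A tiny CPU "sign" kernel, mirroring the sign-kernel demo in the commits.
void SignCPU(const std::vector<float>& x, std::vector<float>* out) {
  out->resize(x.size());
  for (size_t i = 0; i < x.size(); ++i) {
    (*out)[i] = static_cast<float>((x[i] > 0.0f) - (x[i] < 0.0f));
  }
}

int main() {
  KernelFactory::Instance().Register(
      "sign", {Backend::CPU, DataType::FLOAT32}, SignCPU);
  std::vector<float> x{-2.0f, 0.0f, 3.0f};
  std::vector<float> out;
  KernelFactory::Instance().Select(
      "sign", {Backend::CPU, DataType::FLOAT32})(x, &out);
  for (float v : out) std::cout << v << " ";  // prints: -1 0 1
  std::cout << std::endl;
  return 0;
}
```

A registrar macro, as named in the commits, would typically wrap the Register call so each kernel file can register itself at static-initialization time.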
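The "Support Scalar in Tensor Compute Library" commits add an attribute type that carries a single value of one of several basic types while keeping the original type rather than coercing everything to one representation. The following is a minimal sketch of that idea under a hypothetical Scalar class; it is not the actual Paddle implementation.

```cpp
// Illustrative sketch of a Scalar-style attribute; hypothetical, not the
// real paddle Scalar class.
#include <cstdint>
#include <iostream>

class Scalar {
 public:
  // Keep the basic type that was passed in, instead of forcing everything
  // to a single representation.
  Scalar(float v) : tag_(Tag::FLOAT32) { data_.f32 = v; }
  Scalar(double v) : tag_(Tag::FLOAT64) { data_.f64 = v; }
  Scalar(int32_t v) : tag_(Tag::INT32) { data_.i32 = v; }
  Scalar(bool v) : tag_(Tag::BOOL) { data_.b = v; }

  // Convert to the requested type only at the point of use inside a kernel.
  template <typename T>
  T to() const {
    switch (tag_) {
      case Tag::FLOAT32: return static_cast<T>(data_.f32);
      case Tag::FLOAT64: return static_cast<T>(data_.f64);
      case Tag::INT32:   return static_cast<T>(data_.i32);
      case Tag::BOOL:    return static_cast<T>(data_.b);
    }
    return T{};
  }

 private:
  enum class Tag { FLOAT32, FLOAT64, INT32, BOOL } tag_;
  union Data { float f32; double f64; int32_t i32; bool b; } data_;
};

int main() {
  Scalar s(2);                    // attribute built from an int32
  float scale = s.to<float>();    // a scale kernel could read it as float
  std::cout << scale << std::endl;  // prints 2
  return 0;
}
```

This matches the commit note "keep the basic type for attr, instead of using scalar for all": the attribute preserves its original type and is only converted inside the kernel, which the to<T>() call stands in for here.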
File tree
147 files changed, +8516 / -195 lines changed
- cmake
- paddle
- fluid
- framework
- imperative
- inference
- operators
- platform
- pybind
- pten
- api
- include
- common
- core
- utils
- hapi
- include
- lib
- utils
- tests
- infershape
- kernels
- cpu
- cuda
- functions
- eigen
- mkldnn
- npu
- xpu
- tests
- utils
- python/paddle/fluid/tests/unittests