Microsoft open sources high-performance inference engine for machine learning models
- NewsBot
- Views: 152
- 2018.12.05. 22:15
Microsoft yesterday announced that it is open sourcing ONNX Runtime, a high-performance inference engine for machine learning models in the ONNX format, available on Linux, Windows, and Mac. ONNX Runtime lets developers train and tune models in any supported framework and then productionize those models with high performance in both cloud and edge environments. Microsoft is using ONNX Runtime […]
Read More: Microsoft open sources high-performance inference engine for machine learning models