
Dec 4, 2018

Now Available: Open Sourcing ONNX Runtime

Today we are open sourcing ONNX Runtime, a high-performance inference engine for machine learning models in the Open Neural Network Exchange (ONNX) format. Binary packages for Windows, Linux, and Mac with APIs for Python, C, and C# are available today. ONNX Runtime is the first inference engine that fully supports ONNX 1.2+ and delivers an average 2x performance gain. It is optimized for speed and for extensible integration with hardware accelerators. Leading companies such as Intel and NVIDIA are actively working to integrate their custom accelerators into ONNX Runtime.
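As a minimal sketch of what using the Python API looks like, the snippet below loads an ONNX model and runs a single inference. The file name "model.onnx" and the input shape are placeholders for illustration, not part of the announcement.

```python
# Minimal sketch: running inference with the onnxruntime Python package.
# "model.onnx" and the input shape below are illustrative assumptions.
import numpy as np
import onnxruntime as ort

# Create an inference session from an ONNX model file.
session = ort.InferenceSession("model.onnx")

# Build a dummy input matching the model's first declared input.
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run the model; passing None for output names returns all outputs.
outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)
```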

What is the ONNX format? 

Open Neural Network Exchange (ONNX) is the basis of an open ecosystem of interoperability and innovation in AI that Microsoft co-developed to make AI more accessible and valuable to all. An open format for representing machine learning models, ONNX enables AI developers to choose the right framework for their task and hardware vendors to streamline optimizations.
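To make the interoperability concrete, here is a hedged sketch of exporting a model to the ONNX format from PyTorch; the choice of PyTorch, the ResNet-18 model, and the output file name are assumptions for illustration, and any ONNX-capable framework could play the same role.

```python
# Minimal sketch: exporting a trained model to ONNX from PyTorch.
# The model choice and file name are illustrative assumptions.
import torch
import torchvision

# Any torch.nn.Module can be exported; a pretrained ResNet-18 is used here.
model = torchvision.models.resnet18(pretrained=True)
model.eval()

# A sample input defines the graph that is traced during export.
dummy_input = torch.randn(1, 3, 224, 224)

# Write the model to disk in the ONNX format.
torch.onnx.export(model, dummy_input, "resnet18.onnx")
```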

To learn more about ONNX Runtime, read the blog post.


via Azure service updates
