Tuesday, 12 September 2017

Microsoft, Facebook Unveil Open Standard for AI, Deep Learning Networks



As artificial intelligence and deep learning have spread, a wide range of companies have announced compatible products. Everyone from Google and Nvidia to AMD and Fujitsu has thrown a hat into this particular ring. But software that runs on deep learning and AI-specific hardware is still typically a custom solution developed by an individual company. Microsoft and Facebook have come together to change that, with a new common framework for developing deep learning models.

The Open Neural Network Exchange (ONNX) is described as a standard that will allow developers to move their neural networks from one framework to another, as long as both adhere to the ONNX standard. According to the two companies' joint press release, this is not currently possible. Companies must choose the framework they will use for a model before they start developing it, but the framework that offers the best options for testing and tuning a neural network is not necessarily the one with the features you want when you bring a product to market. The press release states that Caffe2, PyTorch, and Microsoft Cognitive Toolkit will support the ONNX standard when it is released this month. Models trained in one framework will be able to move to another for inference.

Facebook's side of the announcement offers a little more detail on how this benefits developers and what kind of cross-framework compatibility is required to support it. It describes PyTorch as built to "push the boundaries of research frameworks, free researchers from the constraints of a platform and allow them to express their ideas more easily than before." Caffe2, by contrast, was designed with "mobile products and extreme performance in mind. Caffe2's internals are flexible and highly optimized, so we can ship bigger and better models onto low-power hardware using all the tricks in the book." By creating a standard that lets models move from framework to framework, developers can take advantage of the strengths of both.


There are still some limitations to ONNX. It is currently not compatible with dynamic flow control in PyTorch, and Facebook refers to other incompatibilities with "advanced programs" in PyTorch that it does not detail. Still, this early effort to create common ground is a positive step. Most of the ubiquitous ecosystems we take for granted - USB, 4G LTE networks and Wi-Fi, to name a few - are fundamentally enabled by standards. A siloed, go-it-alone solution can work for a company developing something it only intends to use internally, but if you want to offer a platform others can build on, standardizing that platform is the way to encourage them to use it.

The main difference between Microsoft and the other companies developing AI and deep learning products is the difficulty Redmond faces in baking them into its consumer-oriented lineup. With Windows 10 Mobile effectively dead, Microsoft has to rely on its desktop Windows market to drive people to Cortana. That's an intrinsically weaker position than Apple's or Google's, which have huge mobile platforms, or Facebook's, which has more than a billion users. ONNX should benefit everyone working in AI, but it is probably more important for Microsoft than for companies with larger user bases. When you own the most popular phone operating system on Earth, you do not have to worry much about whether someone else's neural network models play well with yours.

