Artificial Intelligence: The Time Is Now
Takeaways from Intel AI Developer Workshop @ Bangalore
I just returned from a full-day developer workshop organized by Intel. The focus was on Artificial Intelligence (AI) and how Intel is contributing to humanity's efforts to teach machines to sense, reason and decide. The event was a useful peek into what Intel is bringing to the table in terms of both hardware and software. That said, I wouldn't really call it a developer workshop, since there were no hands-on sessions, demos or even tips and tricks that developers could use. It was structured as a series of talks meant to raise awareness of Intel's involvement in AI and where the market is headed.
AI originated in the 1950s, but it was only in the 1980s, when Machine Learning (ML) came about, that people started to think it might be possible to realize AI. Machines could be trained and then asked to solve problems, their algorithms tweaked as they learned and relearned from more data. Deep Learning (DL) emerged as a sub-branch of ML, with neural networks as the basis of learning. DL has brought us closer to the dream of realizing AI, but DL alone has not achieved it.
What has really been the tipping point is the increasing power of computing: more powerful chips plus cloud computing that enables parallelism like never before. With AI, it's now customary to employ thousands of cores running in parallel to solve a problem using a DL approach. What this means is that problems that were previously intractable are suddenly tractable, whether in genome editing, automated fraud detection, understanding the brain, personalizing education, self-driving cars, crime prevention, or any of the dozens of applications likely to arrive in the future.
One speaker compared AI to the telescope: a tool that will open up new worlds of possibilities. Companies that don't take AI seriously today will be left by the wayside a few years from now. AI might seem like a buzzword today, but it's going to be mainstream within a decade (or less). Computers have from the beginning aided us in number crunching; with AI, they're going to encroach upon decision making. Sure, there will be loss of jobs in routine tasks. Just as surely, new jobs will be created higher up the value chain.
IoT is one area that's going to generate lots of data feeding into AI systems. IoT data has unique characteristics: it is streamed, time-series in nature, correlated, needs real-time analysis, and often lacks training data. Likewise, ML techniques have evolved to handle IoT data: rule-based decisions, anomaly detection using the One-Class Support Vector Machine (OCSVM), the Probabilistic Exponentially Weighted Moving Average (PEWMA), Markov models, and so on. To process data closer to the source and enable real-time decisions, edge computing becomes important. The usual practice is to do the training in the cloud while inferences are made at the edge.
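To make the anomaly-detection idea concrete, here's a minimal sketch of PEWMA-style detection on a streamed time series: the smoothing weight adapts by how probable each new sample is under the current Gaussian estimate, so outliers are flagged without corrupting the running statistics. Function and parameter names are my own choices for illustration, not anything presented at the workshop.

```python
import math

def pewma(samples, alpha=0.98, beta=1.0, threshold=3.0):
    """Flag samples whose z-score against the PEWMA estimate exceeds threshold."""
    s1, s2 = samples[0], samples[0] ** 2   # running first and second moments
    flags = [False]                        # first sample just seeds the estimate
    for x in samples[1:]:
        mean = s1
        std = math.sqrt(max(s2 - s1 ** 2, 1e-12))  # floor avoids divide-by-zero
        z = (x - mean) / std
        flags.append(abs(z) > threshold)
        # probability of x under the current Gaussian estimate
        p = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
        a = alpha * (1 - beta * p)  # surprising samples get less update weight
        s1 = a * s1 + (1 - a) * x
        s2 = a * s2 + (1 - a) * x * x
    return flags
```

Feeding it a steady sensor reading with one spike, only the spike is flagged, and the moving estimate recovers on the very next sample.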
What exactly does Intel offer for IoT? It has a line of processors and SoCs: Curie, Quark, Atom, x86. It has platforms based on these: Galileo, Edison, Joule, MinnowBoard, D1000, D2000, C1000. Zephyr OS brings a lightweight OS to IoT sensor nodes. One set of tools (compilers, IDEs, libraries) can be used for any platform of choice. Libraries such as mraa (mraa.io) and UPM make it easy to integrate sensors. I think UPM is worth checking out, since it provides a nice high-level abstraction over low-level sensor interfacing. It's only when a specific sensor is not supported in UPM that we need to look under the hood, and this is something we can do since UPM is open source. Intel IoT DevKit is a quick way to get started in the world of IoT.
Intel's Xeon family of processors already runs 97% of the world's ML workloads (I'm not sure I got this right: it sounds unusually large). Xeon Phi offers higher performance. Nervana is the DL framework from Intel. Saffron helps with reasoning; one speaker mentioned how it was used to find bugs in software. For visual cognition, the vision part comes from Movidius and the depth sensing comes from RealSense technology. For performance, the Math Kernel Library (MKL) and Data Analytics Acceleration Library (DAAL) will be useful. Intel also has a worldwide community of developers. In short, Intel is trying hard and in earnest to make its mark in AI. Let's just say AI is open for business! AI is no longer limited to recognizing cats and dogs, the "hello world" programs of AI.
Challenges remain. Running ML inference on fixed-point ALUs at the edge will be critical. For example, Julia is a language suited for AI: easy to learn and offering high performance. But it consumes memory on the order of hundreds of MB and hence can't be used on 16-bit MCUs with memories on the order of KB. Unsupervised learning can be a plus where training data is simply not available. Models have to get better: currently, AI does not claim to make the best decisions; at best, it does better than the average human. Can AI be taught common sense and intuition? Talking about these aspects raises more questions than answers. So I decided to ask another IEDF member who happened to be sitting next to me: what does AI mean to him? He replied,
Artificial intelligence is faster and better algorithms. After all, a machine is a machine.
I can't marry a machine.
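As an aside, the fixed-point constraint mentioned earlier can be illustrated with a toy example: quantize float weights and inputs to 8-bit signed integers, do the multiply-accumulate entirely in integer arithmetic (all an MCU's fixed-point ALU can do), and rescale to float once at the end. This is a generic symmetric-quantization sketch of my own, not anything Intel-specific from the workshop.

```python
def quantize(vals, bits=8):
    """Symmetric linear quantization: map floats to signed ints in [-qmax, qmax]."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(v) for v in vals) / qmax  # assumes at least one nonzero value
    return [round(v / scale) for v in vals], scale

def fixed_point_dot(weights, inputs, bits=8):
    """Dot product using only integer multiply-accumulate, then one rescale."""
    qw, sw = quantize(weights, bits)
    qx, sx = quantize(inputs, bits)
    acc = sum(a * b for a, b in zip(qw, qx))  # pure integer arithmetic
    return acc * sw * sx                      # single float rescale at the end

# Example: the true dot product of [0.5, -0.25, 1.0] and [1.0, 2.0, 3.0] is 3.0;
# the quantized version lands close to it, within 8-bit rounding error.
```

The point of the sketch is that the hot loop never touches floating point, which is exactly what inference on a fixed-point edge device needs.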
About the Author
Arvind Padmanabhan graduated from the National University of Singapore with a master’s degree in electrical engineering. With fifteen years of experience, he has worked extensively on various wireless technologies including DECT, WCDMA, HSPA, WiMAX and LTE. He is passionate about training and is keen to build an R&D ecosystem and culture in India. He recently published a book on the history of digital technology: http://theinfinitebit.wordpress.com.