Cisco’s $270 Million Acquisition of AI Startup Accompany

On May 2 this year, Cisco announced an agreement to acquire the business intelligence startup Accompany for $270 million in cash and stock. After the acquisition closes, the Accompany team will join Cisco’s Collaboration Technology Group, which intends to integrate Accompany’s technology into the company’s own collaboration products.

OFweek Optical Communication Network editor’s comments: Cisco is a world-renowned network solutions provider, and the acquisition of Accompany is a concrete step toward strengthening its AI capabilities. With the AI trend sweeping the globe, it is only natural for a corporate giant like Cisco to invest in the technology.

What Do the AI Chips in New Smartphones Actually Do?

Artificial intelligence is coming to your phone. The iPhone X has a Neural Engine as part of its A11 Bionic chip; the Huawei Kirin 970 chip has what’s called a Neural Processing Unit, or NPU, on it; and the Pixel 2 has a secret AI-powered imaging chip that just got activated. So what exactly are these next-gen chips designed to do?

As mobile chipsets have grown smaller and more sophisticated, they’ve started to take on more jobs and more different kinds of jobs. Case in point, integrated graphics—GPUs now sit alongside CPUs at the heart of high-end smartphones, handling all the heavy lifting for the visuals so the main processor can take a breather or get busy with something else.

The new breed of AI chips is very similar; only this time the designated tasks involve recognizing pictures of your pets rather than rendering photo-realistic FPS backgrounds.

What we talk about when we talk about AI

AI, or artificial intelligence, means just what the name says. The scope of the term tends to shift and evolve over time, but broadly speaking it’s anything where a machine can show human-style thought and reasoning.

A person hidden behind a screen operating levers on a mechanical robot is artificial intelligence in the broadest sense—of course today’s AI is way beyond that, but having a programmer code responses into a computer system is just a more advanced version of getting the same end result (a robot that acts like a human).

As for computer science and the smartphones in your pocket, here AI tends to be more narrowly defined. In particular it usually involves machine learning, the ability for a system to learn outside of its original programming, and deep learning, which is a type of machine learning that tries to mimic the human brain with many layers of computation. Those layered systems are called artificial neural networks, loosely modeled on the neural networks inside our heads.

So machine learning might be able to spot a spam message in your inbox based on spam it’s seen before, even if the characteristics of the incoming email weren’t originally coded into the filter—it’s learned what spam email is.
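
To make that concrete, here is a minimal, hypothetical sketch of such a learned spam filter in Python, using scikit-learn; the toy emails and labels are invented purely for illustration:

    # A toy spam filter: the model learns word patterns from labeled
    # examples instead of having the rules hand-coded into it.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    emails = [
        "win a free prize now", "cheap meds limited offer",
        "meeting moved to 3pm", "lunch tomorrow?",
    ]
    labels = ["spam", "spam", "ham", "ham"]

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(emails, labels)  # "learning" from past examples

    # Classify a message the filter has never seen before.
    print(model.predict(["free prize offer just for you"]))  # -> ['spam']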

Deep learning is very similar, just more advanced and nuanced, and better at certain tasks, especially in computer vision—the “deep” bit means a whole lot more data, more layers, and smarter weighting. The most well-known example is being able to recognize what a dog looks like from a million pictures of dogs.

Plain old machine learning could do the same image recognition task, but it would take longer, need more manual coding, and not be as accurate, especially as the variety of images increased. With the help of today’s superpowered hardware, deep learning (a particular approach to machine learning, remember) is much better at the job.

To put it another way, a machine learning system would have to be told that cats had whiskers to be able to recognize cats. A deep learning system would work out that cats had whiskers on its own.
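
As a rough illustration of the deep learning side of that contrast, here is a minimal sketch in Python using PyTorch. The convolutional layers start with random weights and would learn their own feature detectors, whisker-shaped or otherwise, from labeled photos during training; nothing about whiskers is coded in by hand. The layer sizes are arbitrary choices for illustration:

    import torch
    from torch import nn

    # A tiny convolutional network for 64x64 RGB images and two classes
    # (say, "cat" vs "not cat"). The conv filters are learned from data.
    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),   # low-level features (edges, textures)
        nn.ReLU(),
        nn.MaxPool2d(2),                              # 64x64 -> 32x32
        nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level features (maybe whiskers)
        nn.ReLU(),
        nn.MaxPool2d(2),                              # 32x32 -> 16x16
        nn.Flatten(),
        nn.Linear(32 * 16 * 16, 2),                   # one score per class
    )

    fake_photo = torch.randn(1, 3, 64, 64)  # stand-in for a real image
    print(model(fake_photo).shape)          # -> torch.Size([1, 2])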

Bear in mind that an AI expert could write a volume of books on the concepts we’ve just covered in a couple of paragraphs, so we’ve had to simplify it, but those are the basic ideas you need to know.

AI chips on smartphones

As we said at the start, in essence, AI chips are doing exactly what GPU chips do, only for artificial intelligence rather than graphics: offering a separate space where calculations particularly important for machine learning and deep learning can be carried out. As with GPUs and 3D graphics, AI chips give the CPU time to focus on other tasks, and reduce battery drain at the same time. It also means your data is more secure, because less of it has to be sent off to the cloud for processing.

So what does this mean in the real world? It means image recognition and processing could be a lot faster. For instance, Huawei claims that its NPU can perform image recognition on 2,000 pictures every second, which the company says is 20 times faster than a standard CPU could manage.

More specifically, it can perform 1.92 teraflops (a teraflop is a trillion floating point operations per second) when working with 16-bit floating point numbers. Unlike integers, or whole numbers, floating point numbers have decimal points, and they are crucial to the calculations running through the neural networks involved in deep learning.
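
To see why the 16-bit part matters, here is a small, hypothetical sketch in Python with NumPy: half-precision (float16) values take half the memory and bandwidth of standard float32, at the cost of some precision, which neural networks usually tolerate well:

    import numpy as np

    # The same value stored at 32-bit and 16-bit precision.
    x32 = np.float32(0.1)
    x16 = np.float16(0.1)
    print(float(x32), float(x16))  # 0.10000000149011612 vs 0.0999755859375

    # Half precision halves the memory traffic for big weight matrices.
    weights32 = np.ones((1024, 1024), dtype=np.float32)
    weights16 = weights32.astype(np.float16)
    print(weights32.nbytes, weights16.nbytes)  # 4194304 vs 2097152 bytes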

Apple calls its AI chip, part of the A11 Bionic chip, the Neural Engine. Again, it’s dedicated to machine learning and deep learning tasks—recognizing your face, recognizing your voice, recording animojis, and recognizing what you’re trying to frame in the camera. It can handle some 600 billion operations per second, Apple claims.

App developers can tap into this through Core ML, an easy plug-and-play way of incorporating image recognition and other AI algorithms. Core ML doesn’t require the iPhone X to run, but the Neural Engine handles these types of tasks faster. As with the Huawei chip, the time spent offloading all this data processing to the cloud should be vastly reduced, theoretically improving performance and again lessening the strain on battery life.
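
As a rough idea of the developer workflow, here is a hypothetical sketch in Python using Apple’s coremltools package (assuming coremltools 4 or later; the tiny model and names are purely illustrative): train a model elsewhere, convert it to Core ML format, and let iOS decide at runtime whether the CPU, GPU, or Neural Engine executes it.

    import torch
    import coremltools as ct

    # Any trained model would do; a trivial one keeps the sketch short.
    model = torch.nn.Linear(4, 2).eval()
    example = torch.randn(1, 4)
    traced = torch.jit.trace(model, example)  # conversion wants a traced graph

    # Convert to the older-style .mlmodel format ("mlprogram" is the
    # newer alternative); on device, iOS picks the hardware that runs it.
    mlmodel = ct.convert(
        traced,
        inputs=[ct.TensorType(name="input", shape=(1, 4))],
        convert_to="neuralnetwork",
    )
    mlmodel.save("TinyClassifier.mlmodel")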

And that’s really what these chips are about: Handling the specific types of programming tasks that machine learning, deep learning, and neural networks rely on, on the phone, faster than the CPU or GPU can manage. When Face ID works in a snap, you’ve likely got the Neural Engine to thank.

Is this the future? Will all smartphones inevitably come with dedicated AI chips in the future? As the role of artificial intelligence on our handsets grows, the answer is likely yes. Qualcomm chips can already use specific parts of the CPU for specific AI tasks, and separate AI chips are the next step. Right now these chips are only being utilized for a small subset of tasks, but their importance is only going to grow.

WHAT IS AI? HISTORY, DEFINITIONS AND APPLICATIONS

Everyone is talking about artificial intelligence, also known in its abbreviated form, AI. But what is it all about? That’s precisely what we’ll be explaining today.

History

Artificial intelligence is playing an increasingly large role in our lives, and the latest trend is AI chips and the smartphone applications that accompany them. But this technology began to be developed as early as the 1950s, with the Dartmouth Summer Research Project on Artificial Intelligence at Dartmouth College in the U.S. Its origins date back even further, to the work of Alan Turing (to whom we can attribute the famous Turing test), Allen Newell and Herbert A. Simon, but AI did not make it into the spotlight on the world stage until the arrival of IBM’s chess supercomputer Deep Blue, which in 1997 became the first machine to defeat the reigning world chess champion, Garry Kasparov, in a match. AI algorithms have been used in data centers and on large computers for many years, but have only more recently arrived in the realm of consumer electronics.

Definition of artificial intelligence

The definition of artificial intelligence characterizes it as a branch of computer science that deals with automating intelligent behavior. Here’s the hard part: Since you cannot precisely define intelligence per se, artificial intelligence cannot be exactly defined either. Generally speaking, the term is used to describe systems whose objective is to use machines to emulate and simulate human intelligence and the corresponding behavior. This can be accomplished with simple algorithms and pre-defined patterns, but can become far more complex as well.

Various kinds of AI

Symbolic or symbol-manipulating AI works with abstract symbols that are used to represent knowledge. It is the classic AI that pursues the idea that human thinking can be reconstructed on a hierarchical, logical level. Information is processed from the top down, working with human-readable symbols, abstract connections and logical conclusions.
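
As a toy illustration of that top-down, symbolic approach, here is a hypothetical forward-chaining rule engine in Python: all of the “knowledge” lives in explicit, human-readable rules, and the program just applies them (the facts and rules are invented for the example):

    # Rules: if all premises hold, the conclusion is added as a new fact.
    rules = [
        ({"has_fur", "says_meow"}, "is_cat"),
        ({"is_cat"}, "is_mammal"),
    ]
    facts = {"has_fur", "says_meow"}

    # Keep applying rules until no new facts appear (forward chaining).
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print(facts)  # {'has_fur', 'says_meow', 'is_cat', 'is_mammal'}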

Neural AI became popular in computer science in the late 80s. Here, knowledge is not represented through symbols, but rather artificial neurons and their connections—sort of like a reconstructed brain. The gathered knowledge is broken down into small pieces—the neurons—and then connected and built into groups. This approach is known as the bottom-up method that works its way from below. Unlike symbolic AI, a neural system must be trained and stimulated so that the neural networks can gather experience and grow, therefore accumulating greater knowledge.

Neural networks are organized into layers that are connected to each other via simulated lines. The uppermost layer is the input layer, which works like a sensor: it accepts the information to be processed and passes it on below. This is followed by at least two hidden layers, or more than twenty in large systems, stacked hierarchically, which pass on and classify information via their connections. At the very bottom is the output layer, which generally has the fewest artificial neurons. It provides the calculated result in a machine-readable form, e.g. “picture of a dog during the day with a red car.”
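
Here is a minimal sketch of that layered structure in Python with NumPy: an input layer, two hidden layers, and a small output layer, with randomly initialized weights standing in for what training would normally learn (the layer sizes are arbitrary for illustration):

    import numpy as np

    rng = np.random.default_rng(0)

    # Weight matrices connecting the layers: 8 inputs -> 16 -> 16 -> 3 outputs.
    layer_sizes = [8, 16, 16, 3]
    weights = [rng.standard_normal((m, n)) for m, n in zip(layer_sizes, layer_sizes[1:])]

    def forward(x):
        """Pass a signal from the input layer down to the output layer."""
        for w in weights[:-1]:
            x = np.maximum(0.0, x @ w)   # hidden layers with ReLU activation
        return x @ weights[-1]           # output layer: one value per class

    signal = rng.standard_normal(8)      # stand-in for sensor input
    print(forward(signal))               # three machine-readable output values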

Methods and tools

There are various tools and methods for applying artificial intelligence to real-world scenarios, some of which can be used in parallel.

The foundation of all this is machine learning, which is defined as a system that builds up knowledge from experience. This process gives the system the ability to detect patterns and regularities, with ever-increasing speed and accuracy. Machine learning makes use of both symbolic and neural AI.

Deep learning is a subtype of machine learning that is becoming ever more important. Only neural AI, i.e. neural networks, is used in this case. Deep learning is the foundation for most current AI applications. Because the design of a neural network can keep being expanded, made more complex and more powerful with new layers, deep learning is easily scalable and adaptable to many applications.

There are three learning processes for training neural networks: supervised, unsupervised and reinforcement learning, providing many different ways to regulate how an input becomes the desired output. While target values and parameters are specified from the outside in supervised learning, in unsupervised learning the system attempts to identify patterns in the input that have an identifiable structure and can be reproduced. In reinforcement learning, the machine also works independently, but is rewarded or punished depending on its success or failure.
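
Reinforcement learning is the least intuitive of the three, so here is a tiny, hypothetical sketch in Python: a tabular Q-learning agent in a five-cell corridor where only the rightmost cell gives a reward. The agent is never told the answer; it is simply rewarded for reaching the goal and gradually learns to walk right (all the numbers are illustrative):

    import random

    N_STATES = 5                    # corridor cells 0..4; the reward sits at cell 4
    ACTIONS = [-1, +1]              # step left or step right
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

    alpha, gamma, epsilon = 0.5, 0.9, 0.3   # learning rate, discount, exploration

    for episode in range(200):
        state = 0
        while state != N_STATES - 1:
            # Mostly exploit what has been learned, sometimes explore.
            if random.random() < epsilon:
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            next_state = min(max(state + action, 0), N_STATES - 1)
            reward = 1.0 if next_state == N_STATES - 1 else 0.0  # the "reward or punishment"
            best_next = max(q[(next_state, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = next_state

    # The learned policy: typically [1, 1, 1, 1], i.e. move right in every cell.
    print([max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)])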

Applications

Artificial intelligence is already being used in many areas, but by no means are all of them visible at first glance. The following selection of scenarios that take advantage of this technology is therefore by no means a complete list.

Artificial intelligence’s mechanisms are excellent for detecting, identifying and classifying objects and persons in pictures and videos. To that end, simple but compute-intensive pattern recognition is used. Once the image information has been decoded into machine-readable form, photos and videos can easily be divided into categories, searched and found. Similar recognition is also possible for audio data.

Customer service is increasingly using chatbots. These text-based assistants perform recognition using keywords that the customer mentions and respond accordingly. Depending on the use case, such an assistant can be more or less complex.
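
A minimal, hypothetical keyword chatbot in Python shows how simple the basic mechanism can be (the keywords and replies are invented for the example):

    # Keyword -> canned reply; real systems add synonyms, spelling tolerance,
    # and a hand-off to a human agent when nothing matches.
    replies = {
        "invoice": "You can download your invoices under Account > Billing.",
        "password": "Use the 'Forgot password' link on the login page.",
        "refund": "Refunds are processed within 5 business days.",
    }

    def answer(message: str) -> str:
        text = message.lower()
        for keyword, reply in replies.items():
            if keyword in text:
                return reply
        return "Sorry, I didn't understand that. Connecting you to an agent..."

    print(answer("How do I reset my Password?"))  # the password rule fires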

Opinion analysis is not only used for forecasting elections in politics, but also in marketing and many other areas. Opinion mining, also known as sentiment analysis, is used to scour the internet for expressions of opinion and emotion, allowing for the creation of a largely anonymized opinion survey.
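
At its simplest, sentiment analysis can be a weighted word count. Here is a hypothetical lexicon-based sketch in Python; the word weights are invented, and production systems learn them from data:

    # Tiny sentiment lexicon: positive words score above zero, negative below.
    lexicon = {"great": 2, "good": 1, "love": 2, "bad": -1, "awful": -2, "hate": -2}

    def sentiment(text: str) -> str:
        score = sum(lexicon.get(word, 0) for word in text.lower().split())
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    print(sentiment("I love this phone, the camera is great"))  # positive
    print(sentiment("awful battery and bad support"))           # negative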

Search algorithms like Google’s are naturally top secret. The way in which search results are calculated, weighted and presented is largely determined by mechanisms that work with machine learning.

Word processing, or checking the grammar and spelling of a text, is a classic application of symbolic AI that has been used for a long time. Language is modeled as a complex network of rules and instructions that analyzes the building blocks of a sentence and, under some circumstances, can identify and correct errors.
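
For the spelling half, one classic technique is fuzzy matching against a dictionary: suggest the known word closest to what was typed. Here is a minimal, hypothetical sketch using Python’s standard difflib module (the dictionary is a toy stand-in):

    from difflib import get_close_matches

    # Toy dictionary; a real checker ships with tens of thousands of words
    # plus grammar rules layered on top.
    dictionary = ["artificial", "intelligence", "network", "neural", "learning"]

    def correct(word: str) -> str:
        matches = get_close_matches(word.lower(), dictionary, n=1)
        return matches[0] if matches else word  # leave unknown words alone

    print(correct("intelignce"))  # -> intelligence
    print(correct("nueral"))      # -> neural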

These abilities are also used in synthesizing speech, which is currently the talk of the town with assistant systems like Siri, Cortana, Alexa or Google Assistant.

On new smartphone chips like the Kirin 970, artificial intelligence is integrated into a dedicated component, the NPU or neural processing unit. The processor is making its debut in the Huawei Mate 10. You will learn more about it, and the roles that the technology will play on the Huawei smartphone, once we have a chance to experiment with it in the near future. Qualcomm has already been working on an NPU, the Zeroth processor, for two years, and the new Apple A11 chip contains a similar component.

Furthermore, there are numerous research projects on artificial intelligence, and the most prominent of all may be IBM’s Watson. The computer program made its first public appearance in 2011 on the quiz show Jeopardy!, where it faced off against two human contestants. Watson won, of course, and additional publicity appearances followed. A Japanese insurance company has been using Watson since January to check insured customers, their histories and medical data, and to evaluate injuries and illnesses. According to the company, Watson has replaced roughly 30 employees. Loss of jobs through automation is just one of the ethical and social issues surrounding AI that is the subject of corporate and academic research.