Trust Google to never fail to surprise its users. The company has always been known to stay one step ahead of the rest. From anticipating what the technology market needs next to becoming the world's biggest search engine, Google has proven its worth more than once.
Google's latest move could take several technology markets by storm: a new cloud computing service that is expected to launch in the near future.
What is special about this new service? It is meant to provide exclusive access to Google's 'Artificial Intelligence chip', a processor that is the brainchild of Google's own engineers.
The new chip, the services it will power, and the impact it is likely to have on the technology market were announced and discussed at Google I/O by CEO Sundar Pichai. Google I/O is the company's annual developer conference.
What is the new Artificial Intelligence Chip all about?
The key feature of this AI chip is a design that is second to none: the system can not only execute deep neural networks but also train them.
What are deep neural networks?
Deep neural networks, in simple words, are machine learning systems. They power many of the company's core functions, including image and speech recognition, and also drive automated translation and robotics.
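To make the idea concrete, here is a minimal sketch of what "deep" means: layers stacked on top of one another, each a matrix multiply followed by a nonlinearity. The layer sizes below are arbitrary toy values chosen for illustration; real image- or speech-recognition networks are vastly larger, but the structure is the same.

```python
import numpy as np

def relu(x):
    # A common nonlinearity: pass positives through, zero out negatives.
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # layer 1: 4 input features -> 8 hidden units
W2 = rng.normal(size=(8, 3))   # layer 2: 8 hidden units -> 3 output scores

x = rng.normal(size=(1, 4))    # one input example
hidden = relu(x @ W1)          # first layer transforms the input
scores = hidden @ W2           # second layer produces one score per class
print(scores.shape)            # (1, 3)
```

Recognition then amounts to picking the class with the highest score; training is the process of adjusting `W1` and `W2` so those scores come out right.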
The one condition Google has set for this Artificial Intelligence chip is that it will not be sold directly as a standalone product. Instead, it will be made available through the new cloud service that is expected to arrive soon.
This will let any business or developer build and operate software over the internet, tapping into hundreds or thousands of these processors housed in Google's data centers.
More about the AI chip and Cloud service
The new Artificial Intelligence chip, together with the cloud service that will deliver it, fits squarely into the evolution of Google, which can safely be called the most influential company on the internet.
For over ten years, Google has pushed its online empire to new heights by developing its own data center hardware, from computer servers to networking gear and much more.
With this latest development, Google will sell time on the AI chip through the forthcoming cloud service. The magnitude of computing power this provides can be used to build and operate just about anything imaginable, including websites, apps, and other online software.
Will this AI chip be a revenue-generating source?
As of now, most of Google's services are free to users, because the bulk of its revenue comes from advertising. Cloud computing, however, could become one of the company's biggest revenue sources, large enough, by some expectations, to fund Google's future.
The next step
Dubbed TPU 2.0, or the Cloud TPU, the new chip is a sequel to the custom-built processor that has helped drive Google's AI services, including image recognition and machine translation tools, for over two years.
The main difference between TPU 2.0 and the original TPU is that the newer chip can also be used to train neural networks, whereas the original only ran them once training was complete.
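The distinction between the two phases can be sketched in a few lines. "Training" repeatedly adjusts the model's weights; "execution" (inference) only runs the already-trained model forward. Here a single weight is fitted to the toy relationship y = 2x by gradient descent; the numbers are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
xs = rng.uniform(-1.0, 1.0, size=100)
ys = 2.0 * xs                    # the target relationship to learn

w = 0.0                          # the single trainable weight
lr = 0.5                         # learning rate
for _ in range(50):              # training phase: update w to shrink the error
    grad = np.mean(2.0 * (w * xs - ys) * xs)
    w -= lr * grad

def predict(x):                  # execution phase: weights are frozen
    return w * x

print(round(w, 3))               # converges to roughly 2.0
```

The original TPU only ever performed the `predict` step; TPU 2.0 is built to handle the loop above as well.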
Additionally, the chip is offered through a cloud service dedicated to this purpose.
How are neural networks trained otherwise? What are the advantages of the AI chip?
Today, businesses and developers train neural networks on large farms of GPUs, chips originally created to render graphics for games and other software. That could change drastically once the AI chip goes into service.
At present, the Silicon Valley chip maker nVidia leads this market, supplying the GPUs that make up those farms. With Google entering the field, and doing so with a chip designed specifically to train neural networks, the incumbents will face stiff competition.
The chief advantage of TPU 2.0 is that it can train neural networks many times faster than conventional training methods on existing processors, saving a significant amount of time.
To put this in perspective: a training job that takes about a day on conventional hardware could be finished by TPU 2.0 in just a few hours.
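As a back-of-the-envelope illustration, the figures below are assumptions, not published benchmarks: "about one day" on a conventional GPU setup versus a hypothetical "few hours" on TPU 2.0.

```python
# Assumed figures for illustration only.
gpu_hours = 24.0               # "about one day" on conventional hardware
tpu_hours = 6.0                # a hypothetical "few hours" on TPU 2.0
speedup = gpu_hours / tpu_hours
print(speedup)                 # 4.0, i.e. 4x faster under these assumptions
```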
Other big names like Amazon and Microsoft offer their own cloud services for this kind of work, so they do not depend on anyone else's processors. Even they, however, do not possess an Artificial Intelligence chip of the kind discussed here, one that can both train neural networks and execute them.
Does this arena have any competitors?
Though it may not seem so at present, the possibility cannot be ruled out: in the near future, Google could face competition from other big names entering the market.
Many companies, from giants such as Intel to small start-ups, are working on much the same idea: AI chips that can train and execute neural networks and serve as alternatives to what Google is offering. The good news is that all this development should push the technology market forward, and at a faster pace.
Is there a downside to this?
Though Google has produced a unique Artificial Intelligence chip in TPU 2.0, one that can both train and execute neural networks, that alone does not assure the company's success.
Developers cannot simply pick up TPU 2.0 and use it directly; they will have to learn a new way of building and executing neural networks, quite unlike what they have done in the past.
The main reason is that TPU 2.0 is not just a new chip for training and executing neural networks; it is designed specifically for TensorFlow, software for running neural networks that was also developed at Google.
TensorFlow is open source and free for anyone to use, but it is not the only option; many researchers work with competing frameworks such as Torch and Caffe. And every time a new chip arrives, optimizing such software for it can take months.
Even before TPU 2.0 was introduced to the market, some of the Google figures who worked on AI chip development raised doubts about whether the technology market would actually move to the newly developed chip.
The main reason for the doubt was that researchers are already well versed in the tools needed to work with GPUs, so the switch might never materialize.
Notwithstanding these doubts, Google will still provide access to GPUs through its cloud service, because the market for Artificial Intelligence chips is still developing and many different processors will surface over time.
Neural networks are the future
So far we have discussed at length how the AI chip can train and execute neural networks. Let us now take a closer look at neural networks themselves, which form the basis of the technology market's future.
What are neural networks?
Neural networks are complex mathematical systems that learn tasks by analyzing huge amounts of data. For example, a neural network can learn to recognize a cat by analyzing numerous photos of cats. That may sound simple, even obvious, but as a piece of technology it is a very big deal.
Neural networks form the backbone of Google's core function: helping choose search results. Given a huge database of commonly spoken words, a neural network can learn to analyze them and figure out the commands you give a digital assistant, which in turn helps select the correct results.
What is the benefit of neural networks?
The basic advantage of neural networks is that they are changing how technology fundamentally works, including how its hardware is designed. Unlike traditional software, a neural network must be trained; once trained, it can process information in a way loosely resembling the human brain.
But reaching that level takes immense training. To identify a cat, the network must be fed millions of cat images; only after analyzing them can it tell what a cat is.
Companies and developers do this training on GPUs, sometimes several thousand of them, all running inside enormous computer data centers, the same facilities that underpin internet services all over the world.
The flip side is that training on conventional CPU processors, the chips inside the computer servers that drive online software, takes a great deal of time and consumes a great deal of extra electrical power.
Similarly, CPUs are impractical even for executing neural networks. They are poorly suited to applying what a network has learned, for example, using the knowledge gained from analyzing millions of cat images to identify new cats.
The original TPU was designed by Google for exactly this purpose: to execute neural networks, or in simple words, to put to use everything learned during training.
So a chip from Google that not only executes but also trains neural networks is a big leap for the technology market.
Google says one of the main motivations behind the idea was that its machine translation models had grown too bulky to train comfortably; even when training succeeded, it took far longer than the company would like.
And so came TPU 2.0, which combines up to four chips and can manage about 180 trillion floating-point operations per second, in other words, 180 teraflops.
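Since the 180-teraflop figure applies to a four-chip configuration, simple arithmetic puts each individual chip at roughly 45 teraflops:

```python
# 180 teraflops across a four-chip TPU 2.0 configuration
# implies roughly 45 teraflops per chip.
total_tflops = 180
chips = 4
per_chip = total_tflops / chips
print(per_chip)   # 45.0
```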
With that kind of speed advantage, it seems almost certain that the AI chip is here to stay. There will still be plenty of trial and error involved, but the overall shape of the system looks promising.