Intel and Facebook had earlier stated they would work together to build a new class of chip for AI applications.

The chief artificial intelligence (AI) researcher at Facebook has revealed that the technology giant is working on a new class of semiconductor, which is expected to work quite differently from most existing designs. Yann LeCun has reportedly said that future chips used to train deep learning algorithms, which underpin most of the recent progress in AI, will need to be able to manipulate data without breaking it up into numerous batches. Many existing computer chips divide data into chunks and process each batch in sequence, because of the vast amount of data these machine learning systems must handle in order to learn.

In an interview ahead of the release of a research paper he wrote on the history and future of computer hardware designed for AI, Mr. LeCun said the company does not want to leave any stone unturned, particularly when no other company is turning them over. According to sources familiar with the matter, Facebook and Intel have previously said the two companies are working together on a new type of chip designed specifically for AI applications. In January, Intel said it planned to have the new chip ready by the second half of this year.

Mr. LeCun added that graphics processing units (GPUs) would remain vital for deep learning research for the moment. However, those chips are not well suited to running AI algorithms once they have been trained, whether on home digital assistants, on devices such as mobile phones, or in data centers. Future AI chip designs will need to handle information more efficiently, sources commented. In a system like the human brain, most neurons do not need to be activated at any given time, yet current chips process information from every neuron in the network at each step of a computation, even the ones that are not used, which lowers the efficiency of the process, Mr. LeCun added.
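
The inefficiency Mr. LeCun describes can be pictured with a small sketch. The Python example below is a toy illustration with made-up dimensions, not a description of Facebook's or Intel's hardware: it contrasts a dense layer computation, where every neuron's output is computed at every step, with a sparse version that computes only the neurons that turn out to be active (here the active set is taken from the dense result purely for demonstration; real hardware would have to predict or exploit that sparsity directly).

    import numpy as np

    # Toy feed-forward layer: 8 inputs, 6 output neurons (illustrative sizes).
    rng = np.random.default_rng(0)
    W = rng.standard_normal((6, 8))   # weight matrix
    x = rng.standard_normal(8)        # input vector

    # Dense approach: every neuron's dot product is computed at every step,
    # even for neurons whose output is then discarded (zeroed by ReLU).
    dense_out = np.maximum(W @ x, 0.0)

    # Sparse approach: if we know which neurons will be active, we compute
    # only those rows and skip the rest, so work scales with the number of
    # active neurons rather than the full layer width.
    active = np.nonzero(dense_out > 0)[0]      # indices of active neurons
    sparse_out = np.zeros_like(dense_out)
    sparse_out[active] = W[active] @ x

    assert np.allclose(dense_out, sparse_out)
    print(f"{len(active)} of {len(dense_out)} neurons were actually needed")

On current chips the dense path is effectively what runs, which is why processing every neuron at every step, used or not, wastes work when most of a network is quiet at any given moment.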