Facebook is developing its own line of machine-learning chips to handle the workloads associated with its social media platforms.
According to The Information, one chip handles machine-learning tasks such as recommending content to users, while another handles video transcoding.
Over a hundred people are believed to be involved in the machine-learning chip projects.
The move should come as no surprise; in 2018, Facebook hired Shahriar Rabii, who founded Google’s hardware technology engineering and consumer silicon teams, as its new Head of Silicon.
Additionally, the initiative echoes similar efforts by other hyperscalers to leverage their scale and understanding of their own workloads to develop more efficient chips and reduce their reliance on companies such as Intel, AMD, and Nvidia.
Google announced earlier this year that it had developed a custom ‘Argos’ video-transcoding chip that would provide “up to a 20-33x increase in compute efficiency” over its previous traditional server setup.
Google has also developed its own TPU family of AI processors and Titan line of security chips, and this year hired Intel veteran Uri Frank to lead its system-on-chip (SoC) development.
Meanwhile, Amazon Web Services has its own Arm-based CPU line, Graviton, as well as custom chips for AI training (Trainium) and inference (Inferentia). Microsoft is also believed to be developing its own Arm processors.