

Intel's Newest Product to Accelerate Deep Neural Networks

Posted by Nadeesha Thewarapperuma


Nov 22, 2016 11:00:00 AM

[Image: Artix-7 FPGA]

The Intel Deep Learning Inference Accelerator combines traditional Intel CPU chips with an Altera FPGA.

Following Intel's acquisition of FPGA maker Altera, this is the first hardware product Intel has released that builds on Altera's FPGA technology.

Machine learning, the process by which neural networks mimic the human brain, occurs in two stages:

1) Training. In this first stage, the deep neural network learns a new capability from existing data; training involves processing large volumes of data.

2) Inference. In the second stage, the trained network applies what it has learned in a new setting, using the patterns it has identified to complete similar tasks.
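
In code, the two stages look roughly like the minimal sketch below: a toy single-neuron network (plain NumPy, not Intel's software stack; the dataset and names are purely illustrative) is first trained on existing examples, and its frozen weights are then used for inference on inputs it has never seen.

```python
# Minimal sketch of the two stages of machine learning.
# Illustrative only -- this is not Intel's software or the DLIA API.
import numpy as np

rng = np.random.default_rng(0)

# --- Stage 1: Training -------------------------------------------------
# Toy task: learn whether the sum of two inputs exceeds 1.0.
X_train = rng.random((1000, 2))
y_train = (X_train.sum(axis=1) > 1.0).astype(float)

w = np.zeros(2)   # weights of a single-neuron "network"
b = 0.0           # bias
lr = 0.5          # learning rate

for _ in range(2000):                   # gradient-descent loop
    z = X_train @ w + b
    pred = 1.0 / (1.0 + np.exp(-z))     # sigmoid activation
    grad = pred - y_train               # gradient of cross-entropy loss
    w -= lr * (X_train.T @ grad) / len(y_train)
    b -= lr * grad.mean()

# --- Stage 2: Inference ------------------------------------------------
# The learned weights are now fixed; we only run the forward pass on
# new, unseen inputs. This is the workload the accelerator targets.
X_new = rng.random((5, 2))
probabilities = 1.0 / (1.0 + np.exp(-(X_new @ w + b)))
print(np.round(probabilities, 2))
```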


The FPGA provides Intel with a certain degree of wiggle room. The algorithms behind machine learning and deep neural networks are constantly changing, and Barry Davis, General Manager of Intel's Accelerated Workload Group, believes that with an FPGA in place the hardware can be reprogrammed to keep up with software updates and changes.

The product has been designed with very specific end users in mind: mainly large data holders such as Facebook, Amazon, Google, and Microsoft, along with companies like Alibaba, Baidu, and Tencent.

Jason Miles, Sales Director at EarthTron, works exclusively with Altera and Xilinx FPGAs.

We provide free RFQs. Be sure to check out our index page for a list of the FPGAs we have on hand. We are always receiving new FPGA offers, and not every offer is reflected on that page, so even if you don't see what you are looking for, be sure to contact us anyway.

Access Our Special FPGA Price List


Topics: FPGA, Altera, Xilinx
