What do I want to watch tonight? What is the next book I might want to read? Questions like these are usually answered by the suggestions that appear on your screen as you are about to make a selection, whether on your phone, computer, tablet, or television.
As I recall, these recommendations were not always accurate or what you would expect. That was until Amazon revolutionized the e-commerce industry with the Deep Scalable Sparse Tensor Network Engine (DSSTNE), an open-source product recommendation engine built on neural network algorithms.
So, what is so special about Amazon's DSSTNE, beyond leveraging sparse neural networks as the recommendation engine?
- It uses a C++ library to run deep learning neural network algorithms against large sparse input vectors on GPU hardware.
- DSSTNE has sparse input and output layers, with fully connected layers in between.
- It is a wide neural network, with more neurons in its input layer than a typical deep neural network.
- The training and prediction both scale out to use multiple GPUs, spreading out computation and storage in a model-parallel fashion for each layer.
- This model-parallel scaling enables larger networks than are possible on a single GPU.
- The DSSTNE is optimized for fast performance on sparse data sets, common in recommendation problems. Custom GPU kernels perform sparse computation on the GPU, without filling in lots of zeroes.
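The key idea behind that last point can be sketched in a few lines. The snippet below is an illustrative NumPy example, not DSSTNE's actual CUDA kernels: when the input vector is sparse (say, a binary vector with one neuron per catalog item), a dense layer's matrix product reduces to summing only the weight rows for the active indices, so the zeroes are never materialized or touched. All sizes, index values, and variable names here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, n_hidden = 50_000, 64            # one input neuron per catalog item
active = np.array([3, 42, 999, 12_345])   # items this user interacted with

# Weight matrix of a fully connected layer. In DSSTNE this would be
# sharded across GPUs (model parallelism); it stays on the host here.
W = rng.standard_normal((n_items, n_hidden)).astype(np.float32)
b = np.zeros(n_hidden, dtype=np.float32)

# Dense path: materialize the full, mostly-zero input vector.
x = np.zeros(n_items, dtype=np.float32)
x[active] = 1.0
dense_out = x @ W + b

# Sparse path: sum only the rows for the active indices.
# Cost scales with the number of nonzeros, not the catalog size.
sparse_out = W[active].sum(axis=0) + b

assert np.allclose(dense_out, sparse_out, atol=1e-4)
```

With billions of items and only a handful of interactions per user, the sparse path does a few dozen row additions instead of a billion-element matrix product, which is exactly the regime DSSTNE's custom GPU kernels are optimized for.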
Other methods can certainly produce decent recommendations. However, Amazon applies DSSTNE to billions of products and extremely long sparse vectors, a scale at which traditional methods fall short. The next time you need to build a recommendation engine, take a look at DSSTNE: it should not take long to implement, and the rewards are substantial.
You can find out more about DSSTNE here and here.
Happy data crunching!