Google Machine Learning projects you might be interested in reading about
Published on: Tuesday 04-02-2020
Google announces Coral Accelerator Module, Dev Board Mini for 2020
Coral Accelerator Module & Coral Dev Board Mini

This year’s first new product is the Coral Accelerator Module (pictured above). This multi-chip module includes an Edge TPU ASIC and exposes both PCIe and USB interfaces. Google notes that it will be “easy to integrate” with custom PCB designs.
The Accelerator Module will also be part of a new smaller, lower-power, and more affordable alternative to the Coral Dev Board. The aptly named Coral Dev Board Mini features a MediaTek 8167s SoC that enables 720p video encoding/decoding and computer vision use cases. Both will be available in the first half of 2020. The existing Coral System-on-Module is getting 2 GB and 4 GB LPDDR4 RAM configurations, while the SoM serves as the base of the Asus Tinker Edge T — a “maker friendly single-board computer that features a rich set of I/O interfaces, multiple camera connectors, programmable LEDs, and color-coded GPIO header.”
LaserTagger
Although sequence-to-sequence (seq2seq) AI models were first introduced in 2014, their latest iterations have strengthened key text-generation tasks, including sentence formation and grammar correction. Google’s LaserTagger, which the company has open-sourced, speeds up the text generation process and reduces the chances of errors. Compared to traditional seq2seq methods, LaserTagger computes predictions up to 100 times faster, making it suitable for real-time applications. Furthermore, because of its high inference speed, it can be plugged into an existing technology stack without adding any noticeable latency on the user side.
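LaserTagger gains its speed by casting text generation as a text-editing task: rather than generating output from scratch, the model predicts an edit operation for each input token (keep it, delete it, optionally insert a phrase before it) and then applies those edits. The snippet below is a minimal, hypothetical illustration of that apply step in Python; the tag names and the apply_edits helper are ours for illustration and are not the open-sourced LaserTagger API.

```python
# Minimal sketch of LaserTagger-style text editing (illustrative only).
# A trained model would *predict* one edit tag per input token; here we
# hard-code the tags to show how the apply step rebuilds the output text.

from typing import List, Tuple

# Each edit is (operation, phrase_to_insert_before_token).
# "KEEP" copies the token through; "DELETE" drops it.
Edit = Tuple[str, str]

def apply_edits(tokens: List[str], edits: List[Edit]) -> str:
    """Apply per-token keep/delete/insert edits to reconstruct the output."""
    out: List[str] = []
    for token, (op, insertion) in zip(tokens, edits):
        if insertion:            # phrase inserted before this token
            out.append(insertion)
        if op == "KEEP":         # copy the source token unchanged
            out.append(token)
        # "DELETE" contributes nothing
    return " ".join(out)

# Example: fuse two sentences by deleting the repeated subject.
tokens = "Turing was born in 1912 . Turing died in 1954 .".split()
edits = (
    [("KEEP", "")] * 5                      # "Turing was born in 1912"
    + [("DELETE", ""), ("DELETE", "")]      # drop "." and the second "Turing"
    + [("KEEP", "and")]                     # insert "and" before "died"
    + [("KEEP", "")] * 3                    # "in 1954 ."
)

print(apply_edits(tokens, edits))
# -> "Turing was born in 1912 and died in 1954 ."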
Reformer
Billed as an important development of Google’s Transformer — the novel neural network architecture for language understanding — Reformer is intended to handle context windows of up to 1 million words, all on a single AI accelerator using only 16 GB of memory.
Google first proposed the idea of this new Transformer model in a 2019 research paper written in collaboration with UC Berkeley. The core idea behind the model is self-attention — the ability to attend to different positions of an input sequence to compute a representation of that sequence — as elaborated in one of our articles. Today, Reformer can process whole books on a single device, exhibiting great potential. Google has time and again reiterated its commitment to the development of AI. Seeing it as more profound than “fire or electricity”, it firmly believes that this technology can eliminate many of the constraints we face today.
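To make the self-attention idea concrete, here is a minimal NumPy sketch of scaled dot-product self-attention, the building block the paragraph above describes. It is a generic illustration under our own assumptions (array names and toy sizes are ours), not Reformer’s locality-sensitive-hashing attention.

```python
# Minimal sketch of scaled dot-product self-attention (illustrative only).
# Every position attends to every other position of the input sequence
# to compute its output representation.

import numpy as np

def self_attention(x, wq, wk, wv):
    """x: (seq_len, d_model); wq/wk/wv: (d_model, d_head) projections."""
    q, k, v = x @ wq, x @ wk, x @ wv                # queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])         # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ v                              # weighted mix of value vectors

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 8, 16, 4                 # toy sizes, chosen arbitrarily
x = rng.normal(size=(seq_len, d_model))
wq, wk, wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))

out = self_attention(x, wq, wk, wv)
print(out.shape)  # (8, 4): one attended representation per input position
```

Note the (seq_len, seq_len) score matrix: its memory cost grows quadratically with sequence length, which is precisely what Reformer’s hashing-based attention is designed to avoid when scaling to million-word contexts.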

The company has also delved into AI research spread across a host of sectors, from detecting breast cancer to protecting whales and other endangered species.