Coral USB Accelerator
A USB accessory that brings machine learning inferencing to existing systems. Works with Raspberry Pi and other Linux systems.
Performs high-speed ML inferencing
The on-board Edge TPU coprocessor is capable of performing 4 trillion operations per second (4 TOPS), using 0.5 watts per TOPS (2 TOPS per watt). For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at almost 400 FPS, in a power-efficient manner.
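As a quick sanity check, the two efficiency figures quoted above are the same claim stated two ways: 4 TOPS at 2 TOPS per watt implies roughly 2 W at peak, or 0.5 W per TOPS.

```python
# Relate the quoted Edge TPU figures: 4 TOPS at 2 TOPS per watt
# implies 2 W peak power, i.e. 0.5 W per TOPS.
tops = 4.0               # tera-operations per second
tops_per_watt = 2.0      # rated efficiency

watts = tops / tops_per_watt           # total power at peak
watts_per_tops = 1.0 / tops_per_watt   # per-TOPS figure

print(watts)           # 2.0
print(watts_per_tops)  # 0.5
```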
Works with Debian Linux
Connects to any Debian-based Linux system with an included USB 3.0 Type-C cable.
Supports TensorFlow Lite
No need to build models from the ground up. TensorFlow Lite models can be compiled to run on the Edge TPU.
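Once a model has been compiled for the Edge TPU, inference on the host looks like ordinary TensorFlow Lite code plus the Edge TPU delegate. A minimal sketch, assuming the `tflite_runtime` package and `libedgetpu` are installed and an Accelerator is attached (the model path here is illustrative):

```python
def run_inference(model_path, input_data):
    """Run one inference on an Edge TPU-compiled .tflite model.

    Sketch only: requires tflite_runtime, libedgetpu.so.1, and an
    attached Coral USB Accelerator; model_path is a placeholder.
    """
    import tflite_runtime.interpreter as tflite

    # The Edge TPU delegate routes supported ops to the Accelerator.
    interpreter = tflite.Interpreter(
        model_path=model_path,
        experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")])
    interpreter.allocate_tensors()

    input_index = interpreter.get_input_details()[0]["index"]
    interpreter.set_tensor(input_index, input_data)
    interpreter.invoke()

    output_index = interpreter.get_output_details()[0]["index"]
    return interpreter.get_tensor(output_index)
```

Models are compiled for the Edge TPU ahead of time with the Edge TPU Compiler (`edgetpu_compiler model.tflite`), which rewrites supported ops to run on the coprocessor.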
Supports AutoML Vision Edge
Easily build and deploy fast, high-accuracy custom image classification models to your device with AutoML Vision Edge.
- Google Edge TPU ML accelerator coprocessor
- USB 3.0 Type-C socket
- Supports Debian Linux on host CPU
- Models are built using TensorFlow
- Fully supports MobileNet and Inception architectures, though custom architectures are possible
- Compatible with Google Cloud
Edge TPU ML accelerator
- ASIC designed by Google that provides high performance ML inferencing for TensorFlow Lite models
Arm 32-bit Cortex-M0+ Microprocessor (MCU)
- 32 MHz max
- 16 KB Flash memory with ECC
- 2 KB RAM
USB 3.1 (Gen 1) port and cable (SuperSpeed, 5 Gbps transfer speed)
Included cable is USB Type-C to Type-A
Raspberry Pi 2/3/4 Model B/B+ only
Also note that to reach the best inference speed, you should use a USB 3.0 port (unfortunately, the Raspberry Pi 2 and 3 have only USB 2.0 ports).
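To confirm the host actually sees the Accelerator before running a model, the PyCoral library can enumerate attached Edge TPU devices. A small sketch, assuming the `pycoral` package is installed (the function name is ours):

```python
def edge_tpus_attached():
    """List Edge TPU devices visible to the host.

    Returns an empty list when no Accelerator is plugged in.
    Assumes the PyCoral library (pycoral) is installed.
    """
    from pycoral.utils.edgetpu import list_edge_tpus
    return list_edge_tpus()
```

To check which speed the link negotiated on Linux, `lsusb -t` reports 5000M for a SuperSpeed (USB 3.0) connection and 480M for USB 2.0.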