So far, humans have had the edge in the ability to identify objects by touch. But not for long. Using Google's Project Soli, a miniature radar that detects the subtlest of gesture inputs, the St. Andrews Computer Human Interaction group (SACHI) at the University of St. Andrews has developed a new platform, named RadarCat, that uses the chip to identify materials, as if by touch.
Realizing that different materials return unique radar signals to the chip, the SACHI team combined it with their recognition software and machine learning processes that enable RadarCat to identify a range of materials with accuracy in real time! It can also display additional information about the object, such as nutritional information in the case of food, or product information for consumer electronics. The video shows how RadarCat has already learned an impressive range of materials, and even specific body parts. Can Skynet be far behind?
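The core idea, that each material reflects a distinctive radar signature which a trained classifier can then match, can be sketched in a few lines. The sketch below is purely illustrative: the feature vectors, material names, and the nearest-centroid rule are invented stand-ins, not RadarCat's actual pipeline or Soli's real signal format.

```python
import math

# Hypothetical toy data: one short "radar signature" feature vector per
# reading. RadarCat works on Soli's real multi-channel signal; these
# numbers are made up purely to illustrate the classification step.
TRAINING = {
    "glass":    [[0.90, 0.10, 0.30], [0.85, 0.15, 0.35]],
    "wood":     [[0.20, 0.70, 0.50], [0.25, 0.65, 0.55]],
    "aluminum": [[0.60, 0.90, 0.10], [0.62, 0.88, 0.12]],
}

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

# One centroid per material, learned from the training examples.
CENTROIDS = {name: centroid(vs) for name, vs in TRAINING.items()}

def classify(signature):
    """Return the material whose centroid is nearest (Euclidean) to the reading."""
    return min(CENTROIDS, key=lambda m: math.dist(CENTROIDS[m], signature))

print(classify([0.88, 0.12, 0.33]))  # prints "glass"
```

A real system would use a proper learned model over far richer features, but the flow is the same: train on labeled signatures, then match each new reading to the closest known material in real time.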
Again, this could provide robots with a sense of touch, in a manner of speaking, that rivals our own human capacity for object recognition. This has applications for a wide range of robots: industrial machines could recognize the material composition of an object and adjust the force needed to lift it, while wearable versions could assist people with disabilities. This is a technology worth keeping an eye on.
While this tech is a ways away from widespread use, you can still turn anything into a touch sensor today with Touché.
[via /r/linux]
Filed under: hardware, Software Development
// from Hackaday http://ift.tt/2etR8jp