Part of it is that I'm simply taken by the ultra-slick achievement of the engineers who made the smart bird feeder, along with other ingenious examples.
I had been a “taker” of Google’s AIY (Artificial Intelligence Yourself) product line, having purchased the second edition of their Voice Kit, a cardboard build-it-yourself kit aimed at users age 14 and up. I found it entertaining and challenging, largely on the hardware-accompaniment side: the board offers only micro-USB ports, and configuring an SSH and VNC remote-device connection (which was my alternative way around the problem) was still a bit confusing, particularly when it came to establishing a clear-cut routine.
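For anyone who wants to try the same route, the gist of the remote connection is enabling SSH (and the RealVNC server) on the Pi, then driving it headlessly from a desktop machine. Here is a minimal sketch of the desktop side, purely as an illustration: it assumes the paramiko Python library, the default Raspberry Pi credentials, and that the board answers at raspberrypi.local, all of which are stand-ins for your own setup.

```python
# Minimal sketch only: poke at the headless Pi over SSH from a desktop.
# The hostname and credentials below are the Raspberry Pi defaults; substitute your own.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('raspberrypi.local', username='pi', password='raspberry')

# Quick sanity check: confirm the kernel/OS, and whether the RealVNC service
# (vncserver-x11-serviced on Raspberry Pi OS) is enabled for remote desktop use.
stdin, stdout, stderr = client.exec_command(
    'uname -a && systemctl is-enabled vncserver-x11-serviced')
print(stdout.read().decode())
print(stderr.read().decode())
client.close()
```

Once SSH works reliably, the VNC half is mostly a matter of enabling the service in raspi-config and pointing a VNC viewer at the same hostname.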
Regardless, I was still able to get the small System on Module board built, configured, and working over SSH and VNC connections, but I never got around to the real meat and potatoes of the device: loading the Google developer repository and the full library of open-source code behind what the machine could have been, a development module for the Google Assistant with the gRPC source code libraries loaded, in various language flavors and offerings. I was fixated on getting the Java libraries to work, in part because I had been reared largely on Mac OS X, which for years shipped with Java and its .jar packaging supported natively in the operating system. I imagined developing modular .jar instances of pop-up, ad-hoc applications that would appear when invoked by the user. I devoted a fair number of nights of research to the process.
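I never did get the Java flavor going, but to give a sense of what that gRPC development module amounts to, here is a rough, untested sketch in Python instead (the language the kit's own samples use), assuming the google-assistant-grpc package and OAuth credentials already generated with Google's oauthlib tool; the credentials path and device IDs are placeholders.

```python
# Rough sketch, not the kit's code: a single text query to the Google Assistant
# over gRPC. Assumes the google-assistant-grpc package and an OAuth credentials
# file from google-oauthlib-tool; 'credentials.json' and the device IDs are placeholders.
import json

import google.auth.transport.grpc
import google.auth.transport.requests
import google.oauth2.credentials
from google.assistant.embedded.v1alpha2 import (
    embedded_assistant_pb2,
    embedded_assistant_pb2_grpc,
)

ASSISTANT_API = 'embeddedassistant.googleapis.com'

# Load and refresh the stored OAuth credentials.
with open('credentials.json') as f:
    creds = google.oauth2.credentials.Credentials(token=None, **json.load(f))
http_request = google.auth.transport.requests.Request()
creds.refresh(http_request)

# Open an authorized gRPC channel and create the Assistant stub.
channel = google.auth.transport.grpc.secure_authorized_channel(
    creds, http_request, ASSISTANT_API)
assistant = embedded_assistant_pb2_grpc.EmbeddedAssistantStub(channel)

def assist(text_query):
    """Send one text query and print the Assistant's display text."""
    config = embedded_assistant_pb2.AssistConfig(
        audio_out_config=embedded_assistant_pb2.AudioOutConfig(
            encoding='LINEAR16', sample_rate_hertz=16000, volume_percentage=50),
        dialog_state_in=embedded_assistant_pb2.DialogStateIn(
            language_code='en-US', conversation_state=b''),
        device_config=embedded_assistant_pb2.DeviceConfig(
            device_id='placeholder-device-id',
            device_model_id='placeholder-model-id'),
        text_query=text_query)
    request = embedded_assistant_pb2.AssistRequest(config=config)
    for response in assistant.Assist(iter([request])):
        if response.dialog_state_out.supplemental_display_text:
            print(response.dialog_state_out.supplemental_display_text)

assist('what time is it')
```

A voice query streams AssistRequest messages carrying microphone audio instead of a single text_query, which is where most of the real work in the Voice Kit samples lives.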
Then there is Google's more advanced (cutting-edge, high-powered) plug-and-play System on Module device, the bigger brother of the AIY retail offerings (which I had bought from Target). The Voice Kit is currently available from other online retailers for as little as $5, since its board is a Raspberry Pi Zero WH with a single-core processor; the mainline Raspberry Pi was already in its third generation when the Voice Kit was released, and the Raspberry Pi 4 has since come out.
I'd procured a new credit card, so I'm fawning a bit over potential purchases and hardware investments. The Coral development ecosystem seems really exciting, and the specs on the machine are mostly foreign to me, even as a hardware-specs aficionado (apparently not quite enough of one to stay up to date, though).
The custom Linux operating system is built around TensorFlow, so the workflow and the set of options lean heavily toward vision work within the machine-learning ecosystem of development languages and coding environments. There are also cutting-edge modular devices to pair with it, such as miniature cameras with unheard-of specs and special features suited to industry-specific uses, like those found on edmundoptics.com. It's exciting to imagine the projects I could set myself to, although I'm more focused on the audio-vibratory side (ultra-low-frequency and subharmonic extents beyond the audible range, as well as audible frequencies) than on the visual portion of the wavelength spectrum.
High-end camera hardware that can be used in Coral.ai (with Google) project implementations.
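Coming back to the Coral workflow itself: to make the TensorFlow-on-the-Edge-TPU side concrete, here is a minimal sketch in Python (my own illustration, not a tested project), assuming the pycoral library that Google documents for the board; the model file, label file, and image name are placeholders for any Edge TPU-compiled classifier.

```python
# Minimal sketch, not a tested project: image classification on a Coral Edge TPU
# using the pycoral library. Model, label file, and image names are placeholders.
from PIL import Image
from pycoral.adapters import classify, common
from pycoral.utils.dataset import read_label_file
from pycoral.utils.edgetpu import make_interpreter

# Load an Edge TPU-compiled .tflite classifier and its label map.
interpreter = make_interpreter('mobilenet_v2_edgetpu.tflite')
interpreter.allocate_tensors()
labels = read_label_file('labels.txt')

# Resize the input image to the model's expected shape and run inference.
size = common.input_size(interpreter)
image = Image.open('bird.jpg').convert('RGB').resize(size, Image.LANCZOS)
common.set_input(interpreter, image)
interpreter.invoke()

# Report the top prediction and its score.
for c in classify.get_classes(interpreter, top_k=1):
    print(labels.get(c.id, c.id), c.score)
```

Presumably the smart-bird-feeder style projects follow the same pattern, just with pycoral's detection adapter swapped in for the classification one.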
It takes making the first step in development to push the horizon of possibilities further, so to speak. Do check out the links on their hardware offerings, along with the one up at the top of the article, though; the advances in miniaturization packed into the System-on-Module microcomputer form factor make for really exciting projects to delve into.