Just a quick update on my progress migrating from MotionEye to Frigate for my IP camera feeds and motion detection. Setting up Frigate was super easy: I had it processing the camera feeds and detecting people within a few minutes, offloading all of the heavy ML work to the Google Coral USB. The part I've just spent a couple of hours toying with is switching my Home Assistant Lovelace dashboards over to the Frigate camera feeds. Setting it all up was straightforward: I installed the Frigate HA integration, which created all of the HA devices etc., updated my Lovelace cards to show the camera feeds, and it all appeared to work. The bit I'm pulling my hair out over at the moment is that the live camera feeds lag badly on both my phone and my PC (phone connected via WiFi, PC on a wired LAN connection). My gut feeling at the moment is that it's down to the image sizes (two of my cameras are 2K resolution), so I'm going to see if there's a way to downscale them for the dashboards. I'll…
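For anyone curious about the downscaling idea: most IP cameras expose a lower-resolution RTSP substream alongside the main one, and Frigate lets you assign different streams to different roles in its YAML config. A minimal sketch of what I'm planning to try (the camera name, credentials, IP and stream paths here are all placeholders, not my actual setup):

```yaml
cameras:
  front_door:
    ffmpeg:
      inputs:
        # Full 2K main stream - kept for recordings only
        - path: rtsp://user:pass@192.168.1.10:554/main
          roles:
            - record
        # Lower-resolution substream - used for detection, which also
        # keeps the live view lightweight for dashboards
        - path: rtsp://user:pass@192.168.1.10:554/sub
          roles:
            - detect
    detect:
      width: 1280
      height: 720
```

The idea is that the dashboards never need to pull the full 2K stream at all, while recordings keep the full resolution.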
I've not been able to spend much time on my home automation over the last couple of weeks due to work commitments, but I've just sat down for 30 minutes and added a much-needed feature to my lighting setup. Until now, most of my lighting has been triggered on motion, all controlled through Node-RED flows. This works really well and, depending on the time of day, it selects different moods (I use Philips Hue bulbs and sensors, but have the motion sensors connected to a ConBee II for an almost instant reaction time compared to using the Hue API). Anyway, back to the feature. I'll use my office as an example. Most of the time it uses subdued lighting for effect rather than high light levels. This is great when I'm working or gaming, but sometimes I just need the lights to be bright (a recent example was when transplanting my main PC into a new case). My solution is a boolean toggle that overrides the lighting to a separate "bright" mood. The motion d…
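The toggle itself is just a Home Assistant `input_boolean` helper. A minimal sketch of the kind of entity involved (the entity name and icon are illustrative, not necessarily what I used):

```yaml
# configuration.yaml
input_boolean:
  office_bright_override:
    name: Office bright lighting override
    icon: mdi:lightbulb-on
```

The Node-RED motion flow can then check this entity's state when motion fires and, if it's on, skip the time-of-day mood selection and set the "bright" mood instead.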
My Google Coral Edge TPU arrived today, which I'm hoping will make person detection on my CCTV fast and simple, and reduce the number of false positives I currently get. Here's a quick unboxing video. And if you're thinking to yourself "What the heck is that??", here's a little blurb: The Coral USB Accelerator adds an Edge TPU coprocessor to your system. It includes a USB socket you can connect to a host computer to perform accelerated ML inferencing. The on-board Edge TPU is a small ASIC designed by Google that provides high-performance ML inferencing at a low power cost. For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at almost 400 FPS, in a power-efficient manner. In simple terms (that I can almost understand), it takes Machine Learning (ML) tasks and runs them much faster than the HA Blue can on its own. This means my little HA Blue can do things like detect people and vehicles in a CCTV video feed quickly an…
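In case you're wondering how the Coral actually plugs into all of this: on the software side, Frigate just needs a detector entry in its config pointing at the USB Edge TPU, and it handles the rest. A minimal sketch, assuming the USB Accelerator (as opposed to the PCIe or M.2 variants):

```yaml
# Frigate config: tell it to run inference on the Coral instead of the CPU
detectors:
  coral:
    type: edgetpu
    device: usb
```

With this in place, the person/vehicle detection models run on the Coral rather than on the HA Blue's CPU.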