Just a quick update on my progress migrating from MotionEye to Frigate for my IP camera motion detection / camera feeds. Setting up Frigate was super easy: I had it processing the camera feeds and detecting people in just a few minutes, offloading all of the heavy ML pieces to the Google Coral USB. The bit I've just spent a couple of hours toying with is switching my Home Assistant Lovelace dashboards over to the Frigate camera feeds. Setting it all up was easy: I installed the Frigate HA integration (which created all of the HA devices etc.), updated my Lovelace cards to show the camera feeds, and it all appeared to work. The bit I'm pulling my hair out over at the moment is that the live camera feeds really lag on my phone and PC (phone connected via WiFi, PC on a wired LAN connection). My gut feel at the moment is that it's because of the image sizes (2 of my cameras are 2k resolution), so I'm going to see if there's a way to downscale them for the dashboards. I'll ...
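For reference, this is roughly the kind of per-camera tweak I'll be experimenting with — a minimal sketch only, with a made-up camera name and RTSP URL, and assuming Frigate's `detect` width/height settings behave the way I understand them to:

```yaml
# Hypothetical Frigate config sketch — camera name, credentials and
# stream path are placeholders, not my real setup.
cameras:
  front_door:
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@192.168.1.10:554/stream
          roles:
            - detect
    detect:
      width: 1280   # downscale detection from the camera's native 2k
      height: 720
```

The other option I've seen suggested is pointing Frigate (and the dashboard) at the camera's lower-resolution substream, if the camera exposes one, rather than transcoding the main 2k feed.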
I've recently moved from fixed trigger times for certain (mostly lighting) events to user-configurable ones. This gives me an easy way to adjust them via a Home Assistant dashboard, achieved with some helpers and a simple Node-RED subflow. The subflow code can be downloaded here for import into your Node-RED instance. Based on matching the helper values in a cascade of time_range nodes, the subflow outputs one of the following values:

- Morning
- Day
- Evening
- Night
- Unknown (if no matching timeframe is found)

To use it you'll need to create "time" helpers with the following entity ids, which can then be added to a dashboard for easy editing:

- input_datetime.morning_start
- input_datetime.morning_end
- input_datetime.day_start
- input_datetime.day_end
- input_datetime.evening_start
- input_datetime.evening_end
- input_datetime.night_start
- input_datetime.night_end
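I created the helpers through the Home Assistant UI, but if you prefer YAML, a sketch of the equivalent in configuration.yaml would look something like this (assuming I have the input_datetime schema right — the names are just my suggestions):

```yaml
# Hypothetical YAML equivalent of creating the "time" helpers via the UI.
# Time-only helpers: has_date is off, has_time is on.
input_datetime:
  morning_start:
    name: Morning start
    has_date: false
    has_time: true
  morning_end:
    name: Morning end
    has_date: false
    has_time: true
  # ...repeat the same shape for day_start/day_end,
  # evening_start/evening_end and night_start/night_end.
```

Either way, the subflow only cares that the eight entity ids listed above exist.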
My Google Coral Edge TPU arrived today which I'm hoping will make person detection on my CCTV fast and simple, and reduce the number of false positives I get currently. Here's a quick unboxing video. And if you're thinking to yourself "What the heck is that??" then here's a little blurb: The Coral USB Accelerator adds an Edge TPU coprocessor to your system. It includes a USB socket you can connect to a host computer to perform accelerated ML inferencing. The on-board Edge TPU is a small ASIC designed by Google that provides high performance ML inferencing with a low power cost. For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at almost 400 FPS, in a power-efficient manner. In simple terms (that I can almost understand) it takes Machine Learning (ML) tasks and runs them much faster than the HA Blue can do on its own. This means my little HA Blue can do things like detect people and vehicles in a CCTV video feed quickly an...