Deez

joined 1 year ago
[–] [email protected] 5 points 3 weeks ago

Chairs be jumping over Bill

[–] [email protected] 2 points 3 weeks ago (1 children)

I’ve been thinking about getting one. Which model do you have, and how long have you had it?

[–] [email protected] 28 points 3 months ago

Call Me Maybe, such a banger

[–] [email protected] 53 points 3 months ago (3 children)

I assumed it was trying to feast on the goo inside.

[–] [email protected] 43 points 3 months ago (2 children)
[–] [email protected] 1 points 3 months ago

Do you know anyone else who has gone through all you have and ended up where you are?

[–] [email protected] 2 points 4 months ago
[–] [email protected] 3 points 5 months ago (1 children)

And when you go through the door, you must know the language to speak (the protocol), or you may be told to leave or simply be ignored.

[–] [email protected] 1 points 5 months ago (1 children)

Thanks very much for that, I really appreciate it! How have you found your DF64?

[–] [email protected] 1 points 5 months ago (3 children)

The vendor (df64coffee.com) says they align the burrs. Would they need further alignment?

[–] [email protected] 1 points 5 months ago (5 children)

Thanks, I’m looking forward to it! But also a little nervous that I won’t be able to tell the difference. 😅

 

cross-posted from: https://lemmit.online/post/225981

This is an automated archive made by the Lemmit Bot.

The original was posted on /r/homeassistant by /u/janostrowka on 2023-07-19 12:49:02.

Hopefully this will come in handy for our Year of the Voice.

TL;DR: Justin Alvey replaces the Google Nest Mini PCB with a custom ESP32 PCB, which he’s open-sourcing. He shows a demo of an LLM voice assistant paired with Beeper to send and receive messages.

Tweet text thread (I would also highly recommend checking out the video demos on Twitter):

I “jailbroke” a Google Nest Mini so that you can run your own LLMs, agents and voice models. Here’s a demo using it to manage all my messages (with help from @onbeeper) 📷: sound on, and wait for the surprise guest! I thought hard about how best to tackle this, and why.

After looking into jailbreaking options, I opted to completely replace the PCB. This lets you use a cheap ($2) but powerful and developer-friendly WiFi chip with a highly capable audio framework. This allows a paradigm of multiple cheap edge devices for audio & voice detection…

& offloading large models to a more powerful local device (whether your M2 Mac, PC server w/ GPU or even "tinybox"!) In most cases this device is already trusted with your credentials and data so you don’t have to hand these off to some cloud & data need never leave your home

The custom PCB uses @EspressifSystem's ESP32-S3. I went through two revisions, from a module to a SoC package with extra flash, simplifying to single-sided SMT (< $10 BOM). All features such as the LEDs, capacitive touch, and mute switch are working, and it’s even programmable from Arduino (or IDF).

For this demo I used a custom “Maubot” with my @onbeeper credentials (Beeper is a messaging app which securely bridges your messaging clients using the Matrix protocol and e2e encryption); the bot runs locally, serving an API.

I’m then using GPT-3.5 (for speed) with function calling to query this API.
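
For anyone curious what that could look like, here's a minimal sketch (not the project's actual code, which hadn't been released at the time of the thread): the local Maubot endpoint, the `get_recent_messages` function name, and its parameters are made up for illustration; only the OpenAI function-calling flow with the June 2023 `gpt-3.5-turbo-0613` model is real.

```python
import json
import requests
import openai

openai.api_key = "sk-..."                      # your OpenAI key
MAUBOT_API = "http://localhost:8080/messages"  # hypothetical local endpoint served by the Maubot

# Describe the local message API as a callable function for GPT-3.5.
functions = [{
    "name": "get_recent_messages",
    "description": "Fetch recent chat messages from the local Beeper/Matrix bridge",
    "parameters": {
        "type": "object",
        "properties": {
            "contact": {"type": "string", "description": "Filter by contact name"},
            "limit": {"type": "integer", "description": "Max number of messages"},
        },
    },
}]

def get_recent_messages(contact=None, limit=10):
    # Query the locally running bot; the messages never leave the LAN.
    return requests.get(MAUBOT_API, params={"contact": contact, "limit": limit}).json()

messages = [{"role": "user", "content": "Any new messages from Alex?"}]
first = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",   # June 2023 model with function calling
    messages=messages,
    functions=functions,
    function_call="auto",
)
msg = first.choices[0].message

if msg.get("function_call"):
    # Run the requested function locally, then hand the result back
    # so the model can phrase a short spoken reply.
    args = json.loads(msg["function_call"]["arguments"])
    result = get_recent_messages(**args)
    messages += [msg, {"role": "function",
                       "name": "get_recent_messages",
                       "content": json.dumps(result)}]
    final = openai.ChatCompletion.create(model="gpt-3.5-turbo-0613", messages=messages)
    reply_text = final.choices[0].message["content"]
    print(reply_text)
```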

For the prompt I added details such as family & friends, the current date, notification preferences, and a list of additional character voices that GPT can respond in. The response is then parsed and sent to @elevenlabsio.
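
Again purely as a sketch of what "parsed and sent to @elevenlabsio" might look like: the voice ID, model ID, and file handling below are placeholders; only the public `/v1/text-to-speech/{voice_id}` REST endpoint and the `xi-api-key` header come from ElevenLabs' documentation.

```python
import requests

ELEVEN_API_KEY = "..."        # your ElevenLabs API key
VOICE_ID = "your-voice-id"    # any voice from your ElevenLabs voice library

def speak(text: str, voice_id: str = VOICE_ID) -> bytes:
    # Synthesize the reply text and return the raw MP3 bytes.
    resp = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{voice_id}",
        headers={"xi-api-key": ELEVEN_API_KEY, "Content-Type": "application/json"},
        json={"text": text, "model_id": "eleven_monolingual_v1"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.content

# reply_text would come from the function-calling step above.
with open("reply.mp3", "wb") as f:
    f.write(speak("Alex says dinner is at seven."))
```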

I've been experimenting with multiple of these: announcing important messages as they come in, morning briefings, noting down ideas and memos, and browsing agents. I couldn’t resist - here's a playful (unscripted!) video of two talking to each other, prompted to be the AIs from “Her”.

I’m working on open-sourcing the PCB design, build instructions, firmware, and bot & server code - expect something in the next week or so. If you don't want to source Nest Minis (or shells from AliExpress), it's still a great dev platform for developing an assistant! Stay tuned!

3
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]
 

I’m not an artist and I created this with AI. I’m not submitting it, but posting it here as possible inspiration to any real artists.

Please forgive any compression artefacts, I had to shrink it due to file size limits on my Lemmy instance.

Edited for clarity.

 

In Home Assistant 2023.7, a feature was added that allows services to provide a response.

This release brings in a change to Home Assistant, which we consider to be one of the biggest game changers of the past years: Services can now respond with data! 🤯

It is such a fundamental change, which will allow for many new use cases and opens the gates for endless possibilities.

In this release the functionality has only been enabled for a couple of services, but I’m having trouble picturing what we could use this for, now or in the future.

What are some use cases you can think of for this new feature?
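
One example to get the ideas flowing: a morning-briefing script built on `calendar.list_events`, one of the first services to gain a response. This is a sketch only; the entity IDs, the notify target, and the exact shape of the response data are assumptions on my part, so check the docs for your release.

```yaml
script:
  morning_agenda:
    sequence:
      - service: calendar.list_events
        target:
          entity_id: calendar.personal
        data:
          duration:
            hours: 24
        # The service's response data lands in this variable.
        response_variable: agenda
      - service: notify.mobile_app_my_phone
        data:
          message: >
            You have {{ agenda.events | length }} events in the next 24 hours.
```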

 

Why YSK: It will start working faster if you give it a sec.

This is a bug in iOS Progressive Web Apps. Your scrolling gets locked to an element that is off-screen. Continuing to try to scroll while it’s in this locked state keeps it locked longer. Leaving it alone for one second will unlock it.

wefwef is tracking the issue here.

 
 