During the COVID-19 lockdown, working from home while taking care of kids became the 'new normal'. Family members share the same space for working and studying every day. Unfortunately, most parents spend most of their time staring at a computer screen and lose focus on the family at home. To keep family members connected, I built a tiny board with lighting and sound as the kids' commander machine for whenever they want their parents to listen to them. I used a couple of open-source projects to create interactive equipment that kids can train in the way they prefer; all of the technology is based on tiny microcontrollers and TensorFlow Lite for Microcontrollers. Kids can use gestures or speech to give parents a heads-up and help the family connect again. Once parents receive the heads-up, they need to put aside the work at hand and look over at the receiver device. This project gave us a lot of fun time, and we can establish a "focus moment" whenever we want. I hope you can enjoy it as well. Hey, Dad. Listen to me. ;)

License: GNU General Public License v3.0


Hey Dad, Listen To Me !

Table of Contents

  • Motivation: It's all about the family connection
  • Project source
  • Hardware Board
  • How to Play
  • Machine Learning Training
  • Device Implementation

Motivation: It's all about the family connection 👪

Due to the COVID-19 situation, a lot of families have shifted to working from home and distance learning. Working parents and kids share the same space for working and studying. For parents, it's nice to have your kids close by, but I find it very challenging to switch between work mode and family mode in the same house.

Thanks to high internet bandwidth at home, you can actually do all of your work from home with people in different time zones.
That means you could sit at the dining table and watch your computer from 7 am until 9 pm, in one virtual meeting after another, without looking at or talking to your family. 😭

Because of that, I wanted to create a fun interactive project with the kids, so that when they need to talk with a parent they can swipe a gesture or speak a keyword to a tiny machine and give the parent a heads-up.

Check this video for the details.

Youtube Video

Project source

This project is mainly based on three open-source projects:

  1. TensorFlow Lite for Microcontrollers (https://www.tensorflow.org/lite/microcontrollers)
  2. Tiny Motion Trainer (https://experiments.withgoogle.com/tiny-motion-trainer) and
  3. Raspberry Pi Pico SDK (https://github.com/raspberrypi/pico-sdk).


Hardware Board:

  1. Arduino Nano 33 BLE × 2:

    • https://store.arduino.cc/usa/nano-33-ble
    • 32-bit ARM® Cortex™-M4 CPU running at 64 MHz, 1 MB of program memory.
    • This CPU should be good enough for TensorFlow Lite for Microcontrollers to detect gestures and speech keywords.
  2. micro:bit v2 × 2

    • In the beginning, I planned to use Arduino Bluetooth Low Energy (BLE) as the communication protocol. But that requires extra work on a computer or smartphone app to control the communication, which is hard to explain to a 9-year-old kid.
    • I wanted the simplest possible hardware as the primary communication board, so I decided on the micro:bit v2 (https://microbit.org/new-microbit/). Kids can easily add whatever lighting and sound they want in MakeCode (https://makecode.microbit.org/).
    • In the end, I use the radio to communicate between the commander and the receiver. The micro:bit ecosystem also provides a lot of fun multimedia extension boards, like voice speakers, LED light strips, and servo motors, for kids to express their creativity.
  3. Raspberry Pi Pico + Arducam × 1

  4. micro:bit hardware extensions from Seeed x Grove (optional; you can use any other hardware instead).

How to Play:

There are two devices:
A. Commander device
B. Receiver device



  1. When a kid wants their parents to listen to them, they swipe the commander device or speak a keyword to it.
  2. Once the device recognizes the gesture or keyword, the commander device sends a command to the receiver device.
  3. The parent then gets a sound and LED lighting at the dining table. The speaker buzzes every few seconds.
  4. To stop the buzzing, parents need to put aside the work at hand and look over at the receiver device.
  5. The receiver device checks the camera image; after the person detector sees the parent three times, it stops the buzzing. At that point, parents can focus on what the kid wants to talk about.


Have fun, and I hope you enjoy it 😊


Check this video --> https://www.youtube.com/embed/95lYoM273S4


Machine Learning Training

Gesture definition:

When I brought this idea to the kid, we decided on two gestures: time-out and roll.

Time out

Time-out Gesture Example

Roll

Roll Gesture Example

Gesture Training:

The kids and I followed Tiny Motion Trainer (https://experiments.withgoogle.com/tiny-motion-trainer) to create the two gestures.
Collect 50 samples for "Time-Out"


Then, 50 samples for "Roll"


Start the training on Google Cloud.


Verified the trained model; it looks pretty good in my tests. 😊



Finally, I downloaded the example code to my local machine to add a communication function.


Person Detection Training:

The original tflite-micro example works well on the Arduino's Cortex-M4, so I applied the same int8 model to the Pico. The result works well there too (a minimal setup sketch follows below).
https://github.com/tensorflow/tflite-micro/tree/main/tensorflow/lite/micro/examples/person_detection
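
For reference, here is a minimal sketch of the interpreter setup this implies, following the structure of the upstream person_detection example; the op list and tensor arena size are copied from that example and may need adjusting for the Arducam port:

#include "person_detect_model_data.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"
#include "tensorflow/lite/version.h"

namespace {
tflite::ErrorReporter* error_reporter = nullptr;
const tflite::Model* model = nullptr;
tflite::MicroInterpreter* interpreter = nullptr;
TfLiteTensor* input = nullptr;

// 136 KB arena from the upstream example; adjust for the Pico port if needed.
constexpr int kTensorArenaSize = 136 * 1024;
static uint8_t tensor_arena[kTensorArenaSize];
}  // namespace

void setup() {
  static tflite::MicroErrorReporter micro_error_reporter;
  error_reporter = &micro_error_reporter;

  // Map the int8 person detection model that is compiled into the binary.
  model = tflite::GetModel(g_person_detect_model_data);
  if (model->version() != TFLITE_SCHEMA_VERSION) {
    TF_LITE_REPORT_ERROR(error_reporter, "Model schema %d != supported %d",
                         model->version(), TFLITE_SCHEMA_VERSION);
    return;
  }

  // Register only the ops this model needs.
  static tflite::MicroMutableOpResolver<5> micro_op_resolver;
  micro_op_resolver.AddAveragePool2D();
  micro_op_resolver.AddConv2D();
  micro_op_resolver.AddDepthwiseConv2D();
  micro_op_resolver.AddReshape();
  micro_op_resolver.AddSoftmax();

  // Build the interpreter and allocate tensors from the arena.
  static tflite::MicroInterpreter static_interpreter(
      model, micro_op_resolver, tensor_arena, kTensorArenaSize, error_reporter);
  interpreter = &static_interpreter;

  if (interpreter->AllocateTensors() != kTfLiteOk) {
    TF_LITE_REPORT_ERROR(error_reporter, "AllocateTensors() failed");
    return;
  }

  // 96x96 grayscale input tensor for the camera frames.
  input = interpreter->input(0);
}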

Now we can start deploying to the devices.

Device Implementation

Based on our project spec, there are three parts to implement:

  • Arduino gesture detection.
  • Pi Pico person detection.
  • micro:bit communication and interactive hardware control.

Let's start with the Arduino.

Arduino gesture detection

Tiny Motion Trainer already provides Arduino code for download; you can base your own project design on the example. https://experiments.withgoogle.com/tiny-motion-trainer

Source Here

...
  while (isCapturing) {
    ...
    IMU.readAcceleration(aX, aY, aZ);
    IMU.readGyroscope(gX, gY, gZ);

    // Normalize the IMU data between -1 to 1 and store in the model's
    // input tensor. Accelerometer data ranges between -4 and 4,
    // gyroscope data ranges between -2000 and 2000
    tflInputTensor->data.f[numSamplesRead * 6 + 0] = aX / 4.0;
    tflInputTensor->data.f[numSamplesRead * 6 + 1] = aY / 4.0;
    tflInputTensor->data.f[numSamplesRead * 6 + 2] = aZ / 4.0;
    tflInputTensor->data.f[numSamplesRead * 6 + 3] = gX / 2000.0;
    tflInputTensor->data.f[numSamplesRead * 6 + 4] = gY / 2000.0;
    tflInputTensor->data.f[numSamplesRead * 6 + 5] = gZ / 2000.0;

    if (numSamplesRead == NUM_SAMPLES) {

      // Run inference
      TfLiteStatus invokeStatus = tflInterpreter->Invoke();       
      ...
      for (int i = 0; i < NUM_GESTURES; i++) {
        float _value = tflOutputTensor->data.f[i];
        if(_value > maxValue){
          maxValue = _value;
          maxIndex = i;
        }
      }
      Serial.print("Winner: ");
      Serial.print(GESTURES[maxIndex]);
      
      // GPIO pull-up to trigger micro:bit when maxValue over threshold 

    }
  }
...
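
The "// GPIO pull-up to trigger micro:bit when maxValue over threshold" comment is where the communication hook goes. A minimal sketch of that hook, assuming a spare digital pin on the Nano 33 BLE wired to the pin the commander micro:bit polls (P2 in the MakeCode program later in this README); the pin number and threshold below are placeholders, not values from this repo:

// Hypothetical pin and threshold, for illustration only.
const int kTriggerPin = 2;
const float kScoreThreshold = 0.8;

void setupTriggerPin() {
  pinMode(kTriggerPin, OUTPUT);
  digitalWrite(kTriggerPin, LOW);
}

// Call this after the winning gesture is printed: pulse the line so the
// micro:bit's digital_read_pin() sees a high level on its next poll.
void signalMicrobit(float maxValue) {
  if (maxValue > kScoreThreshold) {
    digitalWrite(kTriggerPin, HIGH);
    delay(200);
    digitalWrite(kTriggerPin, LOW);
  }
}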

Pi Pico person detection

I followed the original Arducam guideline to build up my project, working mainly in main_functions.cpp.
https://www.arducam.com/raspberry-pi-pico-tensorflow-lite-micro-person-detection-arducam/#person-detection-diagram

Source Here

#include "main_functions.h"
#include <LCD_st7735.h>
#include <hardware/gpio.h>
#include <hardware/irq.h>
#include <hardware/uart.h>
#include <pico/stdio_usb.h>

#include "detection_responder.h"
#include "image_provider.h"
#include "model_settings.h"
#include "person_detect_model_data.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"
#include "tensorflow/lite/version.h"

...
void loop() {
  ...
  if (kTfLiteOk != interpreter->Invoke()) {   
    TF_LITE_REPORT_ERROR(error_reporter, "Invoke failed.");
  }
  TfLiteTensor *output = interpreter->output(0);
  ...
  // Process the inference results.
  int8_t person_score    = output->data.uint8[kPersonIndex];
  int8_t no_person_score = output->data.uint8[kNotAPersonIndex];
  RespondToDetection(error_reporter, person_score, no_person_score);
  
  // GPIO pull-up to trigger micro:bit
}
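
The "// GPIO pull-up to trigger micro:bit" comment is where the Pico signals the receiver micro:bit, which (in the MakeCode program below) polls P2 and stops buzzing after three positive reads. A minimal sketch of that hook using the pico-sdk GPIO API, with a placeholder pin number:

#include "hardware/gpio.h"

// Hypothetical GPIO wired to the receiver micro:bit's P2 input.
static const uint kSignalPin = 16;

void init_signal_pin() {
  gpio_init(kSignalPin);
  gpio_set_dir(kSignalPin, GPIO_OUT);
  gpio_put(kSignalPin, 0);
}

// Call this from RespondToDetection(): drive the line high while a person is
// in view so the receiver micro:bit can count consecutive detections.
void signal_microbit(int8_t person_score, int8_t no_person_score) {
  gpio_put(kSignalPin, person_score > no_person_score);
}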

Communication Code Behavior

The kid and I worked on this part together.
We use several hardware extension modules to create light and sound to get the partner's attention.

ListenMe_Commander

microbit-ListenMe_Commander_1
Source Here

def on_button_pressed_ab():
    radio.send_number(trigger)
input.on_button_pressed(Button.AB, on_button_pressed_ab)

trigger = 0
radio.set_group(88)
trigger = 8
basic.show_string("COMMANDER ")

def on_forever():
    if pins.digital_read_pin(DigitalPin.P2) == 1:
        radio.send_number(trigger)
        basic.show_icon(IconNames.SMALL_HEART)
        soundExpression.giggle.play_until_done()
    elif pins.digital_read_pin(DigitalPin.P1) == 1:
        soundExpression.happy.play_until_done()
        basic.show_icon(IconNames.HEART)
    else:
        basic.show_leds("""
            . . . . .
            . . . . .
            . . # . .
            . . . . .
            . . . . .
        """)
basic.forever(on_forever)

ListenMe_Receiver

microbit-ListenMe_Receiver_1
Source Here

def on_received_number(receivedNumber):
    global ack_cont, display, command
    if receivedNumber != command and receivedNumber == trigger:
        strip.show_rainbow(1, 300)
        ack_cont = 0
        display = 60
        command = trigger
        basic.pause(100)
    basic.show_number(command)
radio.on_received_number(on_received_number)

def on_button_pressed_a():
    global status
    if status == 0:
        soundExpression.happy.play()
        status = 1
    else:
        status = 0
        strip.show_rainbow(1, 360)
input.on_button_pressed(Button.A, on_button_pressed_a)

def on_button_pressed_ab():
    global command
    command = acked
input.on_button_pressed(Button.AB, on_button_pressed_ab)

def on_button_pressed_b():
    global ack_cont, display, command
    strip.show_rainbow(1, 300)
    ack_cont = 0
    display = 60
    command = trigger
    basic.pause(100)
input.on_button_pressed(Button.B, on_button_pressed_b)

status = 0
display = 0
ack_cont = 0
acked = 0
trigger = 0
command = 0
strip: neopixel.Strip = None
strip = neopixel.create(DigitalPin.P0, 30, NeoPixelMode.RGB)
_4digit = grove.create_display(DigitalPin.P1, DigitalPin.P15)
strip.show_rainbow(1, 360)
radio.set_group(88)
command = 0
trigger = 8
acked = 9

def on_forever():
    global display, ack_cont, command
    if command == trigger:
        basic.show_icon(IconNames.SAD)
        _4digit.show(display)
        music.play_tone(262, music.beat(BeatFraction.QUARTER))
        music.play_tone(262, music.beat(BeatFraction.QUARTER))
        display = display - 1
        strip.show_color(neopixel.colors(NeoPixelColors.RED))
        if pins.digital_read_pin(DigitalPin.P2) == 1:
            ack_cont = ack_cont + 1
            music.play_tone(523, music.beat(BeatFraction.QUARTER))
        if ack_cont >= 3:
            command = acked
        else:
            # Alternate the icon while waiting for acknowledgements
            if ack_cont % 2 == 0:
                basic.show_icon(IconNames.TARGET)
            else:
                basic.show_icon(IconNames.SMALL_DIAMOND)
    elif command == acked:
        soundExpression.happy.play()
        strip.show_rainbow(1, 360)
        basic.show_icon(IconNames.HAPPY)
        command = 0
        _4digit.show(0)
    else:
        strip.rotate(1)
    basic.pause(1000)
basic.forever(on_forever)

All set. After downloading the code and plugging in the USB cable, you can have fun with your kid at home. 😁
