demo robot designed and built by AACC students.
So just to give you some background: the Python program controls a demonstration robot based on K2SO. I have it built from the waist up, mounted on a piece of plywood that will sit in the hall of our local community college. The neck can pan left and right, with limit switches at each end of travel. All motion control is handled by an Arduino, and this setup works very well: the main Python program sends serial commands to the Arduino to drive the neck. The robot also has mouth LEDs, eye LEDs, and body LEDs, all driven by the Arduino and switched on and off by the same serial commands. The head carries a USB camera which detects faces and tracks them in the pan direction.
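As a sketch of what the Python side of that serial link might look like: the command strings and wiring here are hypothetical (the real protocol is whatever the Arduino sketch expects), but wrapping the commands in a small helper class keeps the main program readable. In the real robot the `transport` would be a pyserial `serial.Serial` object.

```python
def frame(cmd: str) -> bytes:
    """Encode one command as a newline-terminated ASCII frame."""
    return (cmd + "\n").encode("ascii")

class RobotLink:
    """Thin wrapper around the serial link to the Arduino.

    `transport` is any object with a .write(bytes) method; on the robot
    that would be pyserial, e.g.
        serial.Serial("/dev/ttyACM0", 115200, timeout=1)
    """
    def __init__(self, transport):
        self.transport = transport

    def send(self, cmd: str) -> None:
        self.transport.write(frame(cmd))

    # The command names below are made up for illustration; the Arduino
    # sketch defines the real ones.
    def pan_home(self):
        self.send("PAN_HOME")

    def mouth_leds(self, on: bool):
        self.send("MOUTH_ON" if on else "MOUTH_OFF")

    def eye_leds(self, on: bool):
        self.send("EYE_ON" if on else "EYE_OFF")

    def body_leds(self, on: bool):
        self.send("BODY_ON" if on else "BODY_OFF")
```

Injecting the transport instead of opening the port inside the class also makes the command layer easy to test on a machine with no Arduino attached.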
The flow of the main program: a person presses the green button to initiate everything. The neck does a pan to home to calibrate itself. Once the neck is calibrated, the camera and tracking part of the code turn on. The program then plays a random greeting MP3, and while the MP3 is playing the mouth LEDs flash away; once the MP3 is done, the mouth LEDs stop. The MP3 asks the user if they have a question. I have a function that converts the voice into text. This text is passed to the ChatGPT API, which replies with a text answer. I'm using a text-to-speech library to generate the voice reply, and the program speaks the answer back to the user. While all this is going on, the face tracking continues. Once the reply is over, a 10-second countdown starts with the program still tracking. Once the countdown is over, it goes back to the beginning, waiting for the green button to be pressed.
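The sequence above can be sketched as one pass through a session function, with the hardware and AI pieces injected as callables so the flow is easy to follow. Every name here (`listen`, `ask_chatgpt`, `speak`, the MP3 filenames, and so on) is a placeholder for whatever helper actually does that job in the real program, not the actual function names.

```python
import random

GREETINGS = ["greeting1.mp3", "greeting2.mp3"]  # placeholder filenames

def run_session(io):
    """One pass through the interaction, given a dict of callables."""
    io["wait_for_button"]()         # block until the green button is pressed
    io["pan_home"]()                # calibrate the neck on the limit switches
    io["start_tracking"]()          # face tracking runs from here on
    io["mouth_leds"](True)
    io["play_mp3"](random.choice(GREETINGS))  # random greeting, LEDs flashing
    io["mouth_leds"](False)
    question = io["listen"]()       # voice -> text
    answer = io["ask_chatgpt"](question)      # text question -> text answer
    io["mouth_leds"](True)
    io["speak"](answer)             # text-to-speech reply
    io["mouth_leds"](False)
    io["countdown"](10)             # 10 s countdown, still tracking
    return answer                   # then loop back and wait for the button
```

The outer program would just call `run_session` in a `while True:` loop; tracking itself would live in a separate thread so it keeps running through the whole session.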