Some projects are built to solve real problems. Some are built just because they’re funny. Mean Clock is a completely unnecessary invention that firmly belongs in the second category. It’s a clock that detects when you’re staring at it and starts screaming obscenities at you.
The Idea
This all started because I went on a Michael Reeves binge—which, if you know anything about Michael Reeves, you know is a very dangerous thing. One of Michael's greatest hits, and the inspiration for this project, was Trigger Me Elmo. He basically hacked an Elmo toy to detect a person's race and hurl stereotypical insults at them. It's an amazing video, and it's a lot less offensive than it sounds. Anyway, I saw this video and the idea struck me like lightning: I needed a clock that verbally abused people for checking the time.
The Build
First, I needed a clock. I found one on Amazon that looked like two eyeballs—one for the hour, one for the minutes. It clearly reads as a clock, but it's hard to actually tell the time at first glance, which made it perfect. Also, it was cheap: only $13.94.
I also bought speakers, a Raspberry Pi 4B (mistake), a Pi camera module, and an SD card. The plan was simple (or so I thought): give it a mouth and eyes (well, a nose… you'll see, it'll make sense), and give it a brain. I get to invoke one of my favorite memes again:
Step 1: Giving the Clock a “Nose” (Well, technically eyes)
I needed to mount a Raspberry Pi camera module into the clock. The camera's position mattered to me because I wanted the thing to still slightly resemble a face, so I drilled the hole slightly below center, right where a nose would be. It sits exactly where the nose would go on a crudely drawn smiley face; but really, this is where the eyes of the clock would be. Nose eyes.
Mounting the camera was an absolute pain in the ass and easily one of the worst parts of the entire project. I couldn’t get my hands inside the clock, so I had to use chopsticks to position it properly. It took way longer than it should have. I later realized I should have just cut the back off the clock instead of drilling through it like a moron.
The chopstick method worked (barely). The camera was in place and connected to the Raspberry Pi. The clock could officially see the world; it just couldn't make any sense of it yet.
Step 2: Making It “See” You
Obviously, the hardest part was getting the Raspberry Pi to detect when someone was staring at the clock. Originally, I wanted to use a neural network for gaze detection. I wasted a lot of time here, and it became one of the key lessons learned from the project.
- The Raspberry Pi 4B couldn’t handle AI models well. It lagged like crazy, had storage limitations, and just wasn’t built for it.
- I tried converting models to TensorFlow Lite—still didn't work, and it honestly made some of the issues worse. New problems with the libraries that leverage the Raspberry Pi's GPU started popping up.
- I wasted way too much time trying to force a neural-network-based solution when I simply didn't have the hardware to support it.
So I pivoted to a simple Python face-recognition package that didn't require heavy AI processing. Then I basically hard-coded values to check:
- Where your pupils were positioned.
- If they stayed there for a certain amount of time.
If both were true, boom—you’re staring at the clock, and it’s about to yell at you.
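Here's a minimal sketch of what that check boiled down to. I'm using OpenCV's stock Haar cascades here for illustration (not necessarily the exact package I ended up with), and the thresholds are placeholders standing in for my hard-coded values:

```python
# Sketch of the stare-detection loop. Thresholds are placeholders,
# tuned in the real thing by a lot of trial and error.
import time
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

STARE_SECONDS = 2.0      # how long the eyes must stay centered
CENTER_TOLERANCE = 0.15  # allowed pupil offset, as a fraction of eye width

def pupil_offset(eye_gray):
    """Estimate how far the pupil sits from the eye's horizontal center."""
    # The pupil is roughly the darkest blob in the eye region.
    _, thresh = cv2.threshold(eye_gray, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(thresh)
    if m["m00"] == 0:
        return None
    cx = m["m10"] / m["m00"]
    _, w = eye_gray.shape
    return abs(cx - w / 2) / w

cap = cv2.VideoCapture(0)
stare_start = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    centered = False
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        offsets = [pupil_offset(roi[ey:ey + eh, ex:ex + ew])
                   for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi)]
        offsets = [o for o in offsets if o is not None]
        if offsets and max(offsets) < CENTER_TOLERANCE:
            centered = True
    if centered:
        # Pupils are centered: start (or continue) the stare timer.
        stare_start = stare_start or time.time()
        if time.time() - stare_start >= STARE_SECONDS:
            print("Stare detected -- cue the insult")
            stare_start = None
    else:
        stare_start = None
```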
There was a ton of trial and error here. This was not perfect by any means, and I spent a lot of time figuring out how much variability in eye position to allow before triggering a stare. I had the script output eye-position values while I looked north, south, east, west, and directly at the camera, then used those measurements to tune the code so a stare only triggered once the eyes held the "direct" position for a certain amount of time.
My estimate is that it worked about 60% of the time. Sometimes it wouldn't trigger at all. Other times (when live) it would trigger when you were merely in its field of view, not even looking at it.
60%, though? That was good enough for a project this dumb, so I moved on.
Step 3: Giving It a Voice
Now that the clock could tell when someone was looking at it, I needed to give it a voice and things to say.
I could have just pre-recorded insults, but that wasn’t funny or cool enough. I wanted to crowdsource my insults, so I built a website (meanclock.jbigs.com) where anyone could submit custom insults.
Here’s how it worked:
- Pick a celebrity voice (Taylor Swift, Gilbert Gottfried, Snoop Dogg, Joe Biden, Kermit the Frog, etc.).
- Type in an obscenity or insult of your choice.
- Submit it.
The website would then:
- Send the text to Neets.ai (RIP), which was a text-to-speech API for celebrity voices that had a generous free tier.
- Store the generated audio in a Google Cloud bucket.
On the Raspberry Pi's end, a script would download any new insults every few minutes and store them locally.
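That Pi-side sync was dead simple. A rough sketch (the bucket name and local folder here are placeholders, and it assumes the google-cloud-storage client library with credentials already configured):

```python
# Pi-side polling loop: pull down any insult clips we haven't cached yet.
import os
import time
from google.cloud import storage

BUCKET = "meanclock-insults"  # hypothetical bucket name
LOCAL_DIR = "insults"
POLL_SECONDS = 300            # "every few minutes"

def sync_insults():
    """Download any insult clips not already cached locally."""
    os.makedirs(LOCAL_DIR, exist_ok=True)
    bucket = storage.Client().bucket(BUCKET)
    for blob in bucket.list_blobs():
        dest = os.path.join(LOCAL_DIR, os.path.basename(blob.name))
        if not os.path.exists(dest):
            blob.download_to_filename(dest)
            print(f"New insult downloaded: {blob.name}")

while True:
    sync_insults()
    time.sleep(POLL_SECONDS)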
Now, every time someone stared at the clock, it would shout completely custom insults generated by my friends, the internet, and whoever else got their hands on the link.
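The playback side was just as simple: when a stare triggered, grab a random cached clip and blast it. Roughly (pygame here for illustration; any audio library would do):

```python
# Pick a random cached insult and play it at full volume.
import os
import random
import pygame

pygame.mixer.init()

def yell_insult(local_dir="insults"):
    """Play a randomly chosen cached insult clip."""
    clips = [f for f in os.listdir(local_dir) if f.endswith(".mp3")]
    if not clips:
        return
    pygame.mixer.music.load(os.path.join(local_dir, random.choice(clips)))
    pygame.mixer.music.play()
    while pygame.mixer.music.get_busy():  # block until the insult finishes
        pygame.time.wait(100)
```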
No safeguards for edgy content; this thing was truly funny.
The Aftermath: A Functional Disaster
Mean Clock worked—but it was a menace.
- It looked like a bomb. Wires everywhere. A weird glowing light. Suspicious as hell; I definitely could not travel with it.
- It false-triggered constantly. I placed it in the corner of the living room so it had a view of people watching TV. Occasionally, you'd be locked in on a great episode when, all of a sudden, at full volume, you'd hear Gilbert Gottfried (RIP) scream "DO YOU HAVE A STARING PROBLEM ASSHOLE????"
- My fiancée hated it, and rightfully so. This thing was annoying and I spent way too much time on it.
And yet, for a glorious moment in time, the clock fulfilled its purpose—it yelled at people for looking at it. And that’s all I ever wanted.
I shelved the project when I got bored of it and needed my Raspberry Pi for a new project. The Mean Clock lives on in spirit and in the hearts of the people who knew it. Every time I catch a glimpse of a clock to check the time, I hear Mike Tyson yell "You got a staring problem or do you just wanna kith me?"
Rebuilding the Lost Code Using AI
I added this part because I thought it was pretty neat. In preparation for writing this post, I went looking for the original files. Like an idiot, I had lost the original Python files that powered Mean Clock when I reset the SD card to troubleshoot a new project. Luckily, I had made a YouTube Short (see below) showcasing the project, and that video included a screen recording of my code.
So, I:
- Went frame by frame, screenshotting snippets of code from the video.
- Uploaded those images to Google AI Studio and asked it to reconstruct the Python script from them.
- Modified the result to work on Windows, since I wanted it to run but was too lazy to test it in a Linux environment or on the actual Raspberry Pi.
Want to See It?
I put together a Git repository with the website files and the rebuilt stare-detecting Python script. Additionally, here is the YouTube Short I created showcasing the project (before I had the blog):
What I Learned
- AI isn't always the best solution, especially in situations where you don't have the resources to properly run it. Other solutions exist that can emulate what you're trying to do. I was caught up in the AI hype wave and originally wanted to use it in this project, but it just clearly wasn't the best fit.
- It’s okay to lower your expectations, especially for a project like this. 60% wasn’t the original goal, but it got the job done, and that is all that mattered.
- Chopsticks are a valid engineering tool.
- Overengineering can be fun. It's not a great habit to get into, but overengineering this project made it a lot more fun to work on.
Would I Ever Revive It?
The Mean Clock sits on my desk today as a trophy, its working life over. Although, I have always wanted to build a smart mirror… maybe with a feature that uses AI to insult you based on what you're wearing or how you look…?