CalHacks 2014

How it all started

This is a tale of great success and disaster, triumph and failure. This is the saga of my experience at CalHacks 2014.

I signed up for the event early this year, and had completely forgotten about it until the day before. I was swamped with job interviews, exams, projects, and homework, so I hadn't planned to attend.

The hackathon started Friday at midnight, so I decided to swing by to check out the sponsors and say hello to my friends. Little did I know that I would get sucked into a 36-hour coding blitz.

Note: For the impatient, a tl;dr can be found here.

The plot thickens

As I was walking around Memorial Stadium, I eventually stumbled upon the Intel Mashery table. I had recently gone to IDF14, where I played around with the Galileo. This piqued my interest in hardware, and when I saw that there were Intel Edisons up for grabs I knew I had to try to make something cool. I picked up a box and set up my laptop, then went to walk around and explore some more.

While I was making the rounds I noticed an interesting company called wit.ai, which was demoing control of a computer system via natural phrases. I realized that if I combined the two, I could build a system that allows me to talk to the Intel Edison using natural language phrases, and have it parse them into intents and trigger actions. I picked up a microphone, got an API key, and went to work.

Once I assembled all my gear, I sat down and started hacking away. The first thing I needed to do was figure out how to have the Edison flip the voltage on an output pin. I had never done hardware before, so it took a little while to understand how the code worked and what needed to be plugged in where. Initially, I wanted to write in C++ and upload the sketch via the Arduino software, but that would involve implementing an HTTP server as well. Doable, but not as fun. I decided to set up SSH and configure a NodeJS server on the Edison itself.

First I needed shell access to the Edison, which involved connecting both USB cables to my laptop and using this command: sudo screen /dev/ttyUSB0 115200. Once the screen popped up, I pressed enter and was in. I configured the device name, wifi, and root auth, and started digging around on the machine. Turns out it runs a tiny distribution called Yocto Linux, so apart from a basic shell there's not much else. I tried looking for IPK files for zsh, git, and tmux, but was unable to locate any. Luckily, Steven had some precompiled packages that he could share.
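For reference, the serial-console dance looked roughly like this. The setup helper shown here shipped with the stock Edison image at the time; your firmware may name it differently:

```shell
# Connect both micro-USB cables, then open the serial console:
sudo screen /dev/ttyUSB0 115200

# Inside the console, the stock image ships a helper that walks through
# the device name, wifi, and the root password in one go:
configure_edison --setup
```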

I got my environment configured and SSH'd in. I opened a tmux session and started setting up NodeJS. In order to listen on port 80, I first had to move the default web server out of the way. It lives in /usr/lib/edison_config_tools/edison-config-server.js, and the last line specifies the port, so I just changed it to 8080. Once this was done, I started teaching myself NodeJS and trying to figure out how to write an API. In order to interface with the hardware, I had to install the mraa library.
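The surgery on the stock image amounted to something like this. The file path is from the paragraph above; the package name for mraa is my best recollection and may differ on your opkg feed:

```shell
# Move the stock config server off port 80: edit the last line of this
# file, changing .listen(80) to .listen(8080), then reboot to apply it.
vi /usr/lib/edison_config_tools/edison-config-server.js
reboot

# Install the GPIO bindings (package name is an assumption; adjust to
# whatever your opkg feed calls the mraa library):
opkg install libmraa0
```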

The idea was to have a node server that would receive POST requests and trigger actions on the digital outputs. I wired up an LED for diagnostics and mapped it to GET /light. After a few hours of messing around with the mraa library and teaching myself node, I finally got it to work. That sweet moment of triumph must have looked quite interesting from the side - an LED blinks and a grown hacker starts jumping for joy...

Once this was working, I needed to wire up a relay and start testing out the actual hack. I eventually procured an extension cord with a spliced in relay, and connected it to the Edison.

I now had half my pipeline working - a REST API running on an Intel Edison that could flip a relay. The other half involved receiving input from a microphone and turning it into text. The problem is, a USB microphone requires alsamixer and a ton of libraries, none of which are present on the Intel Edison. After many hours of searching the web, reading tutorials, and hacking around, I finally found some IPK files that could potentially work. I ran opkg install alsa* in the folder I created, but alsamixer still didn't work. I tried every combination of commands, scripts, configurations, and other tricks, but nothing would let me get input from a microphone. After I found myself on some ancient Bulgarian forum for the fifth time, I realized I was out of options.

I considered switching over to text input, but a home automation system with no voice control is basically useless. It was here that I suddenly remembered the watch on my wrist - not just a regular watch, but the Moto 360. Surely there was a way to get voice commands from the watch to a server - it's an Android platform, how hard could it be?

The watch

Android Wear is a relatively closed platform, and given how young the software is, documentation on the topic is scarce. I browsed some developer resources, but the list of voice commands is fixed. Short of creating an app and hijacking a dummy command such as "take a note" or "call me a car", I was out of luck.

After some more digging, I found an Xposed module called Google Now API, which as of version 1.4 works with Android Wear. Now, all this module does is allow apps to intercept commands meant for Google Now. You still need a driver to catch the voice input and feed it into a script. This is where AutoVoice comes in. Once AutoVoice is installed, you can intercept commands and process them in Tasker, an Android scripting app.

This is where the magic happens. Tasker allows you to perform any action the phone can perform, and in this case, I needed it to perform a POST request to my server. I created the following Task and started testing.

  1. Add a new profile "AutoVoice Recognized Event behaviour: true", and set the configuration to "Event Behaviour" (first tick) and the command filter to the desired trigger word ("jarvis", in my case).
  2. Have this profile trigger a task that sends an HTTP POST to the server (put the Edison's IP here), with the data {"phrase" : "%avcommsnofilter()"}. The content type needs to be set to application/json.

Once this was set up, I was ready to test. I spoke to my watch and watched the notification pop up and the POST request shoot off to the server. The Edison sent an API call to wit.ai, got back an intent, and switched the relay. Success was mine!

The final morning

I woke up on Sunday ready to go and present. Once I arrived at the venue I started running final checks. I had been testing this whole time on the Intel-provided WiFi, but it was being slow that day, so I decided to try the portable hotspot on my phone. Turns out my Android device acting as a hotspot couldn't communicate with its clients. I needed another WiFi spot. I tried setting up an ad-hoc network on my laptop, but with no Ethernet cord there was no upstream connectivity. That was unacceptable for two reasons - I needed wit.ai access, and my Android phone would refuse to connect to the hotspot in the first place, defaulting to mobile data. My third option was the CalVisitor WiFi, but it turns out they had enabled client isolation, which meant that the Edison and the phone could both connect but couldn't ping each other. Even ports 80 and 22 were closed!

My final option was a phone hotspot again, but with a tablet paired to my watch and the phone acting as the hotspot. I now needed a way to transfer everything watch-related from my phone to my tablet. Luckily, both devices were rooted and had Titanium Backup, so I was able to beam the data from the phone to the tablet. I transferred the Android Wear app, Tasker, AutoVoice, GoogleNowApi, and Xposed. It took a while to configure everything, and time was drawing to a close.

Once all of this was finally set up, I needed to verify that everything still worked. I fired up the server on the Edison and spoke some commands. What greeted me was not the friendly click of a lamp turning on, but a segmentation fault. Host Unknown. Okay, let's ping 8.8.8.8. That works. Odd...

Long story short, after trying three different hotspots, rebooting, and even hardcoding the wit.ai IP address, I figured out that the root partition on the Edison had overflowed. I literally had no space left to store the single string needed to resolve a domain name into an IP.

I tried cleaning out random files, but by then it was too late. The system was failing unpredictably, and I had to present in less than half an hour. My two most recent commits wouldn't push, so after memorizing my code I had to nuke the Edison.

Recovery

Watching all my work get erased as the new tarball flashed to root was pretty painful, but I was already planning my course of action. The moment the device booted I screen'd in, set up wifi, and installed git and bash. I quickly cloned my repo, then kicked off npm install. Unfortunately, I had forgotten about the mraa library, so after chasing segfaults I finally remembered to install it manually. By then the deadline to present had hit, and my server was still down.

I frantically coded as fast as possible, bringing my code up to speed with what I had done before. I ironed out some last minute bugs, flipped some debug variables and deployed.

Some friendly hackers helped me carry all my gear down to the floor, where I found my table and set up. There was no power nearby, but luckily I had a spare extension cord and a power brick, so I was able to set up the Edison, my laptop, phone, tablet, and lamp.

The Result

This is a video of a demo at home, the day after the hackathon. The WiFi signal was much better, and some bugs were fixed. As a result, the latency was down to the order of seconds.

Prizes and Shoutouts

Unfortunately, the round 1 judge came by as I was still reflashing the Edison, so I was passed over for the official judging. I feel like I still won, because every time I ran a demo I had a large crowd around me.

As I later found out, the sponsors were handing out personal prizes as well. I ended up winning some neat hardware from Intel Mashery and wit.ai, so a huge thanks to them!

What I learned
  1. Be flexible and know how to recover fast. My laptop's boot drive failed on the second day, but I had a spare drive, an Ubuntu install image ready to deploy, and extra cables and USB drives all ready to go.
  2. Always bring extra devices and know how to use them. If I hadn't had a tablet I would have been sunk - the laptop can't act as a hotspot without Ethernet, and I would have had no way of demoing.
  3. If a Linux system is behaving very oddly, check the free space (df -h). You never know what the culprit is.
Challenges
  1. Diagnosing and repairing a laptop during a hackathon is very stressful and time consuming.
  2. Learning NodeJS in a day is intense - there is no room for error, and any mistake can cost valuable time.
  3. Diagnosing the Edison DNS issues took up too much time - had I known to check the free space, I would have moved faster.
  4. Getting a decent environment on the Edison - installing bash/tmux/git, updating the firmware, configuring wifi.
  5. WiFi issues - the Android hotspot and hacker WiFi had client isolation on, and ad-hoc didn't work.
  6. Transferring half the stack of my hack to an alternate device in the span of minutes.
  7. Redeploying with 10 minutes until presentation time.

Shout Outs!

Many people had a hand in my success. A big thanks goes to the following people: