Tag: "android"

Yahoo! Open Hack Day Brasil 2010

On March 20th and 21st, Yahoo! Brazil brought us our second Open Hack Day. I'd been to the previous one, in 2008, and it was amazing! Our project even won in a newly created category, aptly named "What the Hack?"

This year, I wanted once again to try a hardware hack, using whichever parts I could get my hands on. Not necessarily anything useful, though. That's what I love about the Hack Day. I can do useful stuff throughout the rest of the year :)

Image © brhackday, used with permission

Yahoo! Hack Days

Before the 2008 Hack Day, I had pretty much written off Yahoo! as a company that was no more. I wasn't even particularly interested in the event, and only decided to go at the last minute. Boy, what a change in perspective. Of course, Yahoo!'s São Paulo team has some very clever people. But, more generally, I was very impressed with the data-gathering tools that Yahoo! had started offering. YQL is simply fantastic. It perfectly captures what the Internet is about, data-wise (I'm not saying it's perfect, but it embodies the right spirit). I hadn't had that much geek fun in a long time. So you can probably tell my expectations were high for this year's edition. Could Yahoo! deliver?

No need to fret: they knew what they were doing. Just put some 250 hackers in a fish bowl, give them food, coffee and wifi (surprisingly good, some silly proxy restrictions aside), show them Monty Python, and wait for it!

Our hack, and our hackers

Image © brhackday, used with permission

Even before the announcement of this year's Hack Day, I'd been toying with the (admittedly silly) idea of a firefighting robot that lurked online waiting for people to report fires anywhere in the world. It would then bravely roll over to wherever the fire was, and put it out. Bravely.

Trouble is, it'd probably have to be one gargantuan robot. So I thought I'd settle for a more modest, Lego-built, Arduino-controlled one. With the Hack Day approaching, I suggested this project to a few friends and we created a Wave to discuss the idea. I was planning to use a box of old Lego pieces from my childhood (see, mom? I told you I'd eventually use those again!) and a couple of servo motors I had lying around, but Rodolpho brought his Mindstorms NXT into the picture, making the project much cooler (and, incidentally, much more manoeuvrable).

On Saturday morning I rode with Gola to the (really cool) auditorium of Senac University, where the previous Hack Day had already been hosted. There we met Rodolpho and Mobi, our original team. We had found out the day before that there would be a limit of 4 people on each team this year. In 2008 our team had comprised... 12? 15? I never even knew. We'd simply started building weird, blinking stuff, and people had gathered around for the fun. We were, therefore, a bit disappointed with this edition's limit. But, since we weren't really expecting to win anything (and therefore disqualification wasn't an issue), we bent the rules a bit, and Werneck, Lucmult and Mauro hacked with us. Also, Aline joined us a bit later. Since Werneck and Lucmult eventually had to leave and didn't return for Sunday, and Aline and Mobi didn't program, I think we were sort of in the clear. Technically. Sort of.

And then we started building.

What we built

The robot, under construction. Photo by alickel

The body of the robot ended up using mainly the Mindstorms parts. We had started building a larger, sturdier body with assorted Lego pieces on top of a rigid board, driven by four front wheels with a loose trailing one at the back. But, after a lot of testing, the loose wheel kept veering the robot off track, so we rebuilt everything with a lighter, smaller frame and a pair of caterpillar tracks. In hindsight, I think we should have kept the larger body (even if rethinking wheel traction), as it gave the robot more stability, which we would come to miss later. But we had to make a quick decision, and we worked with what we had. The NXT controller sat on the robot and communicated with an external server via bluetooth, relaying control input to the wheels and sending back odometry information.
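For a flavour of what that drive control looks like, here's a rough sketch using the nxt-python library; the motor ports, power levels and helper function below are assumptions for illustration, not our actual code:

# Sketch only: steering an NXT tracked robot over bluetooth with nxt-python.
# Ports, power levels and timings are illustrative assumptions.
import time
import nxt.locator
from nxt.motor import Motor, PORT_B, PORT_C

brick = nxt.locator.find_one_brick()            # finds a paired NXT brick
left, right = Motor(brick, PORT_B), Motor(brick, PORT_C)

def drive(left_power, right_power, seconds):
    # Run both track motors, then brake; unequal powers make the robot turn.
    left.run(left_power)
    right.run(right_power)
    time.sleep(seconds)
    left.brake()
    right.brake()

drive(70, 70, 2.0)     # roughly forward
drive(50, -50, 0.5)    # pivot in place

# Crude odometry: tachometer readings to report back to the server.
print(left.get_tacho(), right.get_tacho())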

This external server (actually, Rodolpho's notebook) continuously used YQL to query Twitter, Yahoo! Meme and news feeds (which I originally wanted to aggregate using Yahoo! Pipes, but we didn't have time for it), searching for people reporting fires - a simple string search, filtered by the Yahoo! Term Extraction API. Whenever there were reports, the server would send them through the Placemaker API to extract location information. It then determined which location in the world was in most urgent need of aid (by number of reports).
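To give an idea of the data-gathering side (YQL has since been retired, and the query details here are from memory, so treat this as a sketch rather than our real code):

# Sketch: querying YQL for tweets mentioning fires. The twitter.search table
# was a community table, hence the datatables.org env parameter; the response
# shape shown here is approximate.
import requests

YQL_ENDPOINT = "http://query.yahooapis.com/v1/public/yql"

def search_fire_reports(term="fire"):
    params = {
        "q": 'select text from twitter.search where q="%s"' % term,
        "format": "json",
        "env": "store://datatables.org/alltableswithkeys",
    }
    data = requests.get(YQL_ENDPOINT, params=params).json()
    results = (data.get("query") or {}).get("results") or {}
    return [tweet["text"] for tweet in results.get("results", [])]

# The matching texts then went through Term Extraction and Placemaker
# to pull out candidate locations (not shown here).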

Now, the tricky part: the robot needed to be aware of a rectangular projection of the world onto the room, and to know its own location on it. I don't mean a visual projection. We did consider it, but realised that it would be unfeasible for the Hack Day. We toyed with the idea of printing (or drawing) a world map on large sheets and taping them to the ground, but the robot would surely slip, trip, or tear the sheets. So we settled for an abstract projection, and, based on information sent back from the robot, plotted its estimated current position in the "world", at each instant, using Yahoo! Maps and the Yahoo! Geocoding API (via geopy).
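The projection itself is just a linear scaling of latitude and longitude into the room's dimensions, along these lines (the room size here is made up):

# Sketch: mapping world coordinates onto a rectangular "room".
# Room dimensions are an assumption for illustration.
ROOM_WIDTH_CM = 400.0   # x axis, west to east
ROOM_DEPTH_CM = 200.0   # y axis, north to south

def world_to_room(lat, lon):
    # Latitude is in [-90, 90], longitude in [-180, 180].
    x = (lon + 180.0) / 360.0 * ROOM_WIDTH_CM
    y = (90.0 - lat) / 180.0 * ROOM_DEPTH_CM
    return x, y

# For instance, São Paulo (roughly 23.5 S, 46.6 W):
print(world_to_room(-23.5, -46.6))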

To locate itself, the robot relied on Mindstorms' odometry feedback, and on QR Code markers distributed around the room, read by an Android phone using Python and the Barcode Scanner app. The Android phone sent the information encoded in each QR Code to a custom service built with Python's SimpleHTTPServer, which our server polled. The server then sent back new movement controls, according to the robot's current position and desired target. We tried to use Mindstorms' sonar and colour sensors, but they just wouldn't work reliably with the Python interface (which we needed to send commands programmatically via bluetooth).
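That service was essentially a tiny HTTP endpoint holding the last QR code seen: the phone posts each decoded marker, and the control server polls for it. A minimal sketch of the idea (in modern Python, with an invented port and behaviour, not our original SimpleHTTPServer-based code):

# Sketch: a "last QR code seen" service. The phone POSTs each decoded marker;
# the control server GETs the latest one. Port and details are assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

last_marker = b""  # payload of the most recently scanned QR code

class MarkerHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        global last_marker
        length = int(self.headers.get("Content-Length", 0))
        last_marker = self.rfile.read(length)
        self.send_response(200)
        self.end_headers()

    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(last_marker)

if __name__ == "__main__":
    HTTPServer(("", 8000), MarkerHandler).serve_forever()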

A happy robot. Image © codepo8 CC BY 2.0

No one likes an impersonal robot, and ours, accordingly, had a face in order to convey emotions. We used an XO laptop as the head, and decided that the robot would have pre-determined emotions depending on the situation: it would be "at rest" when there were no fires going on, "worried" when it was going towards a fire, and "happy" when it had put the fire out. A Python script running in the background continuously searched Flickr (using YQL) for expressions associated with each of these emotions, and downloaded a number of related pictures. Another script queried the robot's current emotional state (set by the server), and displayed an appropriate random subset of these pictures. For every few of them, we'd display a face that Aline had drawn specifically for that emotion, to make up for the fact that we couldn't be sure if the pictures would depict it (people give the weirdest tags and descriptions to their Flickr uploads...).
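The Flickr search was another YQL query, roughly along these lines (again a sketch from memory: the table, field names, URL pattern and emotion-to-search-term mapping below are approximations, not our code):

# Sketch: fetching candidate "emotion" pictures from Flickr via YQL and
# picking a random subset. Details are approximate.
import random
import requests

YQL_ENDPOINT = "http://query.yahooapis.com/v1/public/yql"
EMOTION_TERMS = {"rest": "calm", "worried": "worried", "happy": "happy"}

def emotion_photo_urls(emotion, limit=10):
    query = ('select id, farm, server, secret from flickr.photos.search '
             'where text="%s"' % EMOTION_TERMS[emotion])
    data = requests.get(YQL_ENDPOINT, params={"q": query, "format": "json"}).json()
    photos = ((data.get("query") or {}).get("results") or {}).get("photo", [])
    if isinstance(photos, dict):   # YQL returns a bare dict for a single result
        photos = [photos]
    urls = ["http://farm%s.static.flickr.com/%s/%s_%s.jpg"
            % (p["farm"], p["server"], p["id"], p["secret"]) for p in photos]
    return random.sample(urls, min(limit, len(urls)))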

We originally meant to use an Arduino to control a servo that would squirt some water from a syringe, thus putting out the fire once the robot got to its destination. But all that proved too much for a mere 24h, and we had to settle for a manual pump.

A camera-shy robot, and a sleepy presenter

Of course, we'd tested (almost) everything, and the robot worked (almost) perfectly. (Almost) Great. But, when it was time to go up on the stage, we hit a few snags.

First, there was only one large screen. Of course, we'd known that, but it hadn't occurred to us that people would need to see not only the robot (which was tiny, especially at a distance), but also the map showing its movement towards the fire, at several different moments. Yahoo! had even predicted such an eventuality (seriously, guys, kudos!) and provided us with a way to switch between our server's screen and the feed from a video camera which Aline used to film the robot. But I didn't manage to coordinate the switch properly, and we didn't get to show the animated map.

Also, we had already noticed that the XO weighed a lot compared to the rest of the robot (remember what I said about the previous, sturdier version?), and that, in order to steer, we needed to balance it carefully. I didn't, and the robot soon veered off its path.

Finally, we'd used so many different technologies and APIs in this project that most of them simply vanished from my mind when I started presenting! Note to self: next time, write some hints on the back of my hand...

All I have to say for myself is that I had slept for only about 1 hour since the previous morning, ours was one of the last projects to be presented, and by then I was very sleepy and barely able to react. I should have switched from the camera to the map more often, so people could see what was going on. I should have grabbed the robot when it went off track and restarted the demo. I should have asked my teammates for help when trying to list everything at work in the robot. Shame on me.

But, all in all, I think it was a great project, and I'm very proud of it. I'm especially proud of my team: you rock much more than I was able to show :)

And, next year, I promise to take a nap before presenting!

The Robot. Image © brhackday, used with permission


Pushing up Python on Android

A few days ago, I put on the manliest voice I could muster and made an announcement to my wife: "Stand aside, woman! I am going to the gym!"

Needless to say, she was thoroughly unimpressed but somewhat amused when, half an hour later, she found me at the computer, programming.

Despite what she might tell you, I hadn't given up on exercising. You see, I had recently taken up the One Hundred Pushups, Two Hundred Situps and Two Hundred Squats programs. These involve a few sets of a varying number of repetitions each, with timed pauses between each one. So, for instance, on my first day I'd do 10 pushups, rest for 60 seconds, do 12 pushups, rest, 7 pushups, rest, 7 pushups, rest, and finally as many pushups as I can (but at least 9). These numbers of repetitions vary as you progress. My problem was keeping track of how many repetitions of which exercise to perform on any given day.

Now, there are nice PDFs with the whole exercise program on each site, but they're supposed to be printed. On paper. How low tech! Some people use spreadsheets, but... Meh. So I decided I should turn to my Android phone for help.

There is an iPhone app for One Hundred Pushups et al, and at first I considered writing an Android app to match. I still might, but, of course, that's a full-on project, one that would definitely not be usable in time for me to exercise that night. So: a pragmatic program, up and running in a very short time (my wife was laughing out loud by then), to be improved as the need arises. This looks like a job for... Python!

Python is not (yet?) a first-class citizen on the Android, but it's a respectable second-class one, thanks to Damon Kohler and his Android Scripting Environment. ASE lets you run several interpreted languages on the Android, amongst them Python, Lua, Perl and JRuby. However, these are limited in what they can access of the Android API. More specifically, you can't build arbitrary user interfaces or create new activities (though you can invoke existing ones).

Still, having a Python interpreter on your mobile can be handy. I needed to input three sets of repetitions (one for each exercise program) and have Android let me know how many repetitions to do next, and for how long to rest between them. I'm still fiddling with this code (trying to weigh making it better versus building a proper app versus actually, you know, exercising); it's just a quick hack I cooked up to get going, but it's growing on me. Anyway, here it is:

from time import sleep
import android

droid = android.Android()

# How long to rest between repetitions
rest = 60
# Warning before starting next round
wake = 10

# One line per exercise set, each number of repetitions separated by spaces
# For instance (pushups, situps, squats):
# 10 12 7 7 >=9
# 9 9 6 6 >=8
# 19 24 19 19 >=27
user_input = droid.getInput("Series", "Describe all repetition sets in your series:")
series = [i for i in user_input["result"].splitlines() if i.strip()]

def interval(theres_more=True, rest=rest, wake=wake):
    droid.makeToast("Rest for %d seconds..." % rest)
    sleep(rest - wake)

    if theres_more:
        droid.makeToast("Ready? %d seconds to start!" % wake)
        droid.vibrate(500)
    sleep(wake)

    if theres_more:
        droid.makeToast("Go!")
    droid.vibrate(3000)

for s in series:
    droid.getInput("New series!", "Ready?")
    sets = s.split()
    l = len(sets)
    for n, repetitions in enumerate(sets):
        droid.getInput(repetitions, '(press Ok when finished)')
        interval(theres_more=(n+1<l))

droid.makeToast("w00t! Congratulations!")

(you can also download it here)

As explained in the comments, it initially expects lines containing the number of repetitions in each set. So, if I'm undertaking pushups, situps and squats (respectively), I might input:

10 12 7 7 >=9
9 9 6 6 >=8
19 24 19 19 >=27

Of course, ">=9" is not a number, but the script will use whatever you type there as the label when prompting you to perform your repetitions.

You'll notice that the script uses getInput for displaying messages when it expects the user to press "Ok" (even though it doesn't expect any typed input at all). That's because, currently, getInput is the only graphical widget provided by the Python proxy for the Android API. But more on that later.

So, try it out (if you're willing to exercise at all, or if you're just curious), and let me know what you think! Did it help you exercise?
