Silly Simple Twitter Search App with JavaScript

March 13th, 2009

I’ve been meaning to spend some time playing with Twitter’s API. So, I decided to build a really simple JS app on top of the search API.

Essentially, I want to show a list of statuses from a search, say ‘baroquebobcat’, and do it with the least amount of effort possible.

There are more complicated ways to do it, but I wanted to see how easy it was.

First, I thought I would do what the API docs suggest and poke around with curl.

Since all I want to do is a search, like you would using a browser, why not try that, only with JSON, for extra awesomeness.

$ curl 'http://search.twitter.com/search.json?q=baroquebobcat'

Running that gets me a long string of JSON:

{"results":[{"text":"Talkin' 'bout sandwiches", ...

Breaking that down, it structurally looks like:

{
  "results": [
    { "text":"Talkin' 'bout sandwiches",
      "to_user_id":null,
      "from_user":"baroquebobcat",
      "id":1307603293,
      "from_user_id":125785,
      "iso_language_code":"en",
      "source":"web<\/a>",
      "profile_image_url":"http:\/\/s3.amazonaws.com\/twitter_production\/profile_images\/79555083\/face02_normal.jpg",
      "created_at":"Tue, 10 Mar 2009 22:04:12 +0000"
    },
    ...
  ],
  "since_id":0,
  "max_id":1307851898,
  "refresh_url":"?since_id=1307851898&q=baroquebobcat",
  "results_per_page":15,
  "next_page":"?page=2&max_id=1307851898&q=baroquebobcat",
  "completed_in":0.033281,
  "page":1,
  "query":"baroquebobcat"
}

The results array contains the statuses that match the search; the other top-level attributes are metadata about the search.

For my silly simple JS app, all I really need is a subset of this veritable bevy of info.

I am not even going to bother with the search metadata, and am going to throw away most of the status data too.

For now, I only care about the text, and the user who tweeted it.

{
  "results": [
    { "text":"Talkin' 'bout sandwiches",
      "from_user":"baroquebobcat"
    },
    ...
  ]
}

So, I have this list and a URL to get stuff. How does that go into the JavaScript?

We need a callback. Because of all that cross-site restriction stuff, you can’t do Ajax requests for data like this (well, you can, as long as you proxy it through your host somehow, like this guy). So, the way to get all this stuff in executable form is to add a callback parameter to the request. It wraps the JSON in a function call so you can use it in a script tag.

$ curl 'http://search.twitter.com/search.json?q=baroquebobcat&callback=awesome'
awesome({"results":[{"text":"Talkin' 'bout sandwiches", ...})

So, in an HTML file you can do something like this:
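Assuming awesome is already defined somewhere on the page, the tag is just:

<script type="text/javascript" src="http://search.twitter.com/search.json?q=baroquebobcat&callback=awesome"></script>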

and awesome will be called with the response from the search.


Cut to the chase

So, we need a callback.

Because we lack imagination, let’s call it twitter_callback.
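The whole page is just the callback definition plus the script tag that kicks off the search. A minimal sketch (the markup details are whatever you like):

<html>
  <head>
    <title>Super Silly Twitter App</title>
    <script type="text/javascript">
      // The search response gets wrapped in a call to this function.
      // It writes "user: tweet" for each matching status, one per line.
      function twitter_callback(response) {
        var results = response.results;
        for (var i = 0; i < results.length; i++) {
          document.write(results[i].from_user + ': ' + results[i].text + '<br/>');
        }
      }
    </script>
  </head>
  <body>
    <!-- The request itself; the returned script calls twitter_callback. -->
    <script type="text/javascript"
      src="http://search.twitter.com/search.json?q=baroquebobcat&callback=twitter_callback"></script>
  </body>
</html>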


Pretty cool, huh?

What that does is define a callback that just writes out each status from the search, with the user’s name and their tweet on each line.

Actually, that is a bit boring.

All we have is the list, and it’s not very portable, but it demonstrates how it works.

You could make it more robust by using a JavaScript function to insert the script tag, thereby allowing you to dynamically set the query.
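A sketch of that, with a made-up function name:

function twitter_search(query) {
  // Build a script tag whose src is the search URL for the given query,
  // then append it so the browser fetches it and runs twitter_callback.
  var script = document.createElement('script');
  script.type = 'text/javascript';
  script.src = 'http://search.twitter.com/search.json?q=' +
    encodeURIComponent(query) + '&callback=twitter_callback';
  document.getElementsByTagName('head')[0].appendChild(script);
}

If you go that route, the callback should append to an element instead of using document.write, since calling document.write after the page has loaded replaces the whole page.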

There are problems with using JS to do all the work here. It can be slower, and it can also be a security risk, if you don’t trust Twitter.

Laters.

Mountain West Ruby Conf Day -1

March 13th, 2009

I am here in Salt Lake City. Chilling at the library because it is a beautiful building and neatly downtown. In the auditorium I can see they have the projector working.

Behavior Driven Behavior Driven Development

March 4th, 2009

Like all true CS aficionados, I have a thing for recursively applying layers of abstractions on top of themselves. Makes for interesting stories.

I have been using much more JavaScript than I ever thought I would. And, I like it for the most part. But, there is this niggly feeling in the back of my head because I’ve been doing so without a snuggly net of tests. Or even specs.

I looked at jsunit, jsspec and Screw.Unit. But, I didn’t really like the way they worked. So, in the usual way of things, I started on my own.

The first thing I didn’t like about the frameworks when I started looking at them was that they tried to avoid adding things to Object.prototype, which makes it more difficult to do RSpec-style expectations like (Ruby, not JS):

"foo".should include('oo')

because should is not available on “foo”.

Mostly, they get around this by using wrapper functions:

expect("foo").to(include("oo")

and

value_of("foo").should(include("oo"))

when what I want is

"foo".should("include","oo")

It is silly, I know, but I got used to RSpec and I want my JavaScript specs to be just as free of line noise. Hanging stuff off of Object.prototype is the easy way to do this, but it can break things, mostly for (… in …) loops. Arguably, you shouldn’t use those for much anyway; they are dangerous in some circumstances. That’s mostly an opinion thing, though.
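To give an idea, a stripped-down version of the approach, with include as the only matcher (not the real framework, just the idea):

// Hang should off Object.prototype so every value responds to it.
Object.prototype.should = function(matcher, expected) {
  var matchers = {
    include: function(actual, expected) {
      return actual.indexOf(expected) != -1;
    }
  };
  if (!matchers[matcher](this.valueOf(), expected)) {
    throw new Error('expected ' + this + ' to ' + matcher + ' ' + expected);
  }
};

"foo".should("include", "oo"); // passes
"foo".should("include", "xx"); // throws

And because should is enumerable on every object, it shows up in every for (… in …) loop, which is exactly the breakage mentioned above.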

The other problem I had was finding coherent documentation. All of the JS testing frameworks have fairly good documentation, but I had some trouble figuring out what the most current practice was. Some of jsspec looks a lot like what I am trying to do. But now I have started, and it is an interesting project.

The project I am using it for is a gomoku game, maybe backed by Sinatra at some point, but for now it is just JS.

I am planning on posting some code later; right now it doesn’t even have setup hooks.

Mountain West Ruby Conference

March 4th, 2009

I am going to Salt Lake next week to participate in an awesome conference. And yes this post is just a blatant excuse to post the attending badge.


I'm attending MountainWest RubyConf 2009!

JoCo Woah

March 2nd, 2009

Went up to Chicago and saw Jonathan Coulton on Saturday. It was pretty awesome. Paul and Storm opened for him, as well as singing backup for a number of his songs. What would have been more awesome?

The night before apparently. Two of my fandoms collided in what must have been a sweet explosion of goodness.

Chicago was fun though. We went up on Friday and spent Saturday afternoon wandering. We walked around downtown and got internet at the library, after trying Starbucks, forgetting that they changed wifi carriers and you need a special card now or something. Grr, corporate greed, grr. The library was this big, new building that felt like a WPA project, but according to Wikipedia it was built in ’91.

A small culture shock was seeing several people watching porn on the lab computers. It was a little surprising. And distracting.

I didn’t have a Chicago pizza, but I did have a ridiculously dressed hot dog.

Hot Dog with everything


Summary:
The show was excellent. Chicago can be very expensive, but the hot dogs are delicious.

Too Many Drafts

March 2nd, 2009

One of the guidelines I have been sticking to in my newly reborn blogging adventure is to avoid using drafts where possible.
Honestly I don’t know what I have against drafts, except that I have this pile of them from when I was in Japan, 2 years ago now.
I started this blog then, thinking that it would be good to have a journal of all the things I did while I was there, the second semester anyway. Along the way I left unfinished many stories of my travels. Some more unfinished than others.
That pile of drafts I ought to finish was a part of why I didn’t write much here in the intervening 581 days or so (Date.parse('january 5th 2009') - Date.parse('jun 5th 2007')). Since I have begun blogging again, from time to time I look at the list of old orphaned things and wonder if I shouldn’t post the good ones and get rid of the ones I will never get around to.
So, if you see any posts that start with funny characters and seem to talk about my time in Japan, that’s why.

Keyboards and Doom

February 24th, 2009

So, I have become ambikeyboard. I use Dvorak, like a crazy person, on my ’nix machine at work and qwerty on the Windows machine. It took a while, but now my brain has figured out how to deal, more or less (I am on qwerty atm). But I also combined this with learning emacs, so now I have the most common emacs commands memorized and I feel fairly comfortable with it, on Dvorak. Qwerty + emacs == stumbly fingers.

On the other hand, the key placement for some things makes more sense on qwerty.

Maybe I should try vim next. Just to show those motor control neurons who is boss.

Weather that’s too mild

February 21st, 2009

Weather here in the Midwest is mostly boring, except for the whole freezing rain thing.
I miss real snow.
Like when I took this picture.

Bike on Main St Bozeman


Much better.

I guess it might snow here tomorrow, but I bet they won’t even need to salt the roads.

Bah.

2009 hairdoos

February 18th, 2009

So, now that I have been posting semi-frequently, I am starting to get a bit of traffic. Like ~10 hits per week, but that is better than any previous point. But the weirdest thing was this one search term, ‘2009 hairdoos’. Now, I know from doing some web analytics stuff at work that weird search terms are not particularly unusual, but... huh?

Dreamhost Passenger Sinatra or more yak hair

January 30th, 2009

In my ongoing adventures with Sinatra and shared hosting, I have discovered a number of new ways to shave a yak.

My preblog project warmup has been taking shape over the past week or so as a microapp that scrapes my library account and gives me a convenient way to find out whether I have any books overdue or any holds that are ready for pickup. The actual application was easy: use scrubyt + modifications (firewatir is not a good dependency on a server) to log in, grab my checked-out books, etc., and store them. Then, serve that info up on a simple web page.
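The serving half is about as small as Sinatra apps get. A rough sketch, assuming the scraper dumps its results into a YAML file (the file name and view are made up):

require 'rubygems'
require 'sinatra'
require 'yaml'

get '/' do
  # Show whatever the scraper stored on its last run.
  @books = YAML.load_file('library_status.yml')
  erb :status
end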

The Problem: dependency hell + funny runtime environments.

Initially, I had bmizerany’s sinatra-0.9.0.4 vendored, because I wanted to be one of the cool kids and grab things off GitHub. It worked beautifully, ran nicely, specs passed... on my machine. On the server, it kept giving me this error:

Exception NameError in Passenger::Rack::ApplicationSpawner (uninitialized constant Rack::MethodOverride)

After some searching, pain, etc., I discovered that the problem was that the Rack on Dreamhost is 0.4 and Sinatra 0.9.x.x uses Rack 0.9+. So, I tried installing the new Rack locally and using the GEM_PATH environment variable to pick it up. Unsurprisingly, this failed. I tried a number of different tactics, like using the Apache SetEnv directive and using Ruby’s ENV hash. None of it worked.
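Concretely, those tactics look something like this (paths made up):

# in the .htaccess / vhost config
SetEnv GEM_PATH /home/username/.gems

# or at the top of config.ru
ENV['GEM_PATH'] = '/home/username/.gems'
require 'rubygems'
Gem.clear_paths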
Eventually, I gave up on getting 0.9.x.x running and replaced my vendored copy with 0.3.3:

$ sudo gem install -v 0.3.3 sinatra
$ cd /path/to/app/vendor
$ rm -rf sinatra
$ gem unpack -v 0.3.3 sinatra
$ git add *
...
$ git commit -a -m 'downgraded sinatra'
$ cap deploy

I had to change my config.ru to make sense to Sinatra as it now was, but it worked.
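For reference, a config.ru for a 0.3.x-era Sinatra app under Passenger looks roughly like this (the require paths and app file name are placeholders):

require 'rubygems'
require 'vendor/sinatra/lib/sinatra'

# Passenger runs the app, so tell Sinatra not to start its own server.
Sinatra::Application.default_options.merge!(
  :run => false,
  :env => :production
)

require 'app'
run Sinatra.application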

Somehow that doesn’t sound as annoying as I found it. I spent several hours working this out, and now it works more or less perfectly. Passenger can’t run the scraper, because scrubyt also requires hoe >= 1.5.0, which Dreamhost doesn’t have in their shared gems (theirs is 1.2.1). Passenger can’t see my local gems either, but I set up a cron job, which should run as me and thus be able to see them.