The random rantings of a concerned programmer.

lawds animu bullshit

June 08th, 2009 | Category: Random

One of the many things I hate is having to keep up to date on downloading animu bullshit. I want to be able to come home from work and have something say “OH HEY THE LATEST EPISODE OF EDEN IS OUT AND OH BTW I ALREADY DOWNLOADED IT FOR YOU” so I can focus on more important things like taking off my ID badge and whipping my dick out.

I’m sure I’ve ranted about wanting a system like this before, so I’ll spare the gory details. I’ve got the first half done — it pulls the RSS feed from Baka-Updates (since they publish in an easily parseable format[1]), then dumps any new entries into an SQLite3 database. I’m half-tempted to scrape all the torrent data off Baka-Updates too, but I’ve found a lot of that shit goes out of date pretty fast (trackers change, takedowns, etc.).
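For the curious, the title format those feeds use is just `[Group] Series Name 05v2`, so a regex-free version of the parser in footnote [1] can be sketched in a few lines. This is my own toy version — `parseTitle` and its tuple shape are invented names, not the actual code:

```haskell
import Data.Char (isDigit)

-- Parse a Baka-Updates-style title like "[Group] Series 05v2" into
-- (group, series, episode, version). A sketch, not the real parser.
parseTitle :: String -> Maybe (String, String, Int, Maybe Int)
parseTitle ('[':rest) =
  case break (== ']') rest of
    (grp, ']':' ':title) ->
      case reverse (words title) of
        (lastTok:series) | isEpisode lastTok ->
          let (ep, ver) = splitEp lastTok
          in Just (grp, unwords (reverse series), ep, ver)
        _ -> Nothing
    _ -> Nothing
parseTitle _ = Nothing

-- "05v2" -> (5, Just 2); "05" -> (5, Nothing)
splitEp :: String -> (Int, Maybe Int)
splitEp s = case break (== 'v') s of
  (ep, 'v':v) -> (read ep, Just (read v))
  (ep, _)     -> (read ep, Nothing)

-- Does a token look like an episode number, optionally with a vN suffix?
isEpisode :: String -> Bool
isEpisode s = case break (== 'v') s of
  (ep@(_:_), 'v':v@(_:_)) -> all isDigit ep && all isDigit v
  (ep@(_:_), "")          -> all isDigit ep
  _                       -> False
```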

Also, fuck the goddamn Gist plugin for not displaying a vertical scrollbar and just shitting all over the page layout in general.


Just two components left to write — a utility to maintain a list of series to track (and drop the .torrent file into rtorrent’s watch folder), and another utility à la xeyes that lets me know when the shit is done in a reasonable manner[2].
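The watch-folder half of that first utility is basically one file copy. A minimal sketch using `System.Directory` — the paths and the `dropTorrent` name are mine, and rtorrent just has to be configured to watch the same directory:

```haskell
import System.Directory (copyFile, createDirectoryIfMissing, doesFileExist)
import System.FilePath ((</>), takeFileName)

-- Drop a downloaded .torrent into rtorrent's watch directory, skipping
-- files that have already been dropped. Returns True if we copied it.
dropTorrent :: FilePath -> FilePath -> IO Bool
dropTorrent watchDir torrent = do
  createDirectoryIfMissing True watchDir
  let dest = watchDir </> takeFileName torrent
  already <- doesFileExist dest
  if already
    then return False
    else copyFile torrent dest >> return True
```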

Also, I’m going to write my own fucking Gist WordPress plugin that fucking proxies the cross-domain request and buffers the result so the fucking JavaScript doesn’t have to use a slow-as-fuck JSONP request to GitHub’s slow-as-fuck services.


References:
[1] parse_title = head . (=~ "\\[([^\\]]+)\\](.*) ([0-9]+)?(v([0-9]+))?")
[2] dicks everywhere

6 comments

(Untitled)

April 18th, 2009 | Category: Random

FFFFFFFUUUUUUUUUUUUUUUUUUUUUUUUUUUU

I just discovered a really fucking massive huge flaw in my shitty ORM shit. The ORM layer is supposed to lazily resolve foreign references, right? So if you have a schema

You’ll notice that the post table contains a self-reference. So the ORM layer will generate the following code for that table –

The own-table foreign key, post_parent, is the killer, because it causes parseSql' to lazily build an infinite structure. On its own that isn’t fatal — laziness means the loop is never forced — but when you throw that DbRecord at the automatically-generated JSON thingy, it tries chomping through the entire structure and gets tangled up.

There’s no good way to fix this, aside from doing a topological sort of all the inter-table relations and having the ORM layer go “oh hey” when it starts to loop. That kind of functionality could be added either to the parseSql' function above, or to the toAscListJSON function, which converts the DbRecord into a JSON-ingestible form (and which should be replaced by using Data.Data instead…)
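The “oh hey” check doesn’t even need the full topological sort — threading a set of visited primary keys through the resolver and stopping when one repeats is enough to terminate. A toy sketch of the idea (the `Post` record and `parentChain` are stand-ins for the generated code, not the real parseSql'):

```haskell
import qualified Data.Map as M
import qualified Data.Set as S

-- Toy stand-in for the post table: id, title, optional post_parent.
data Post = Post { postId :: Int, postTitle :: String, postParent :: Maybe Int }

-- Resolve a chain of post_parent references, carrying a visited set so a
-- self-reference (or a longer cycle) terminates instead of looping forever.
parentChain :: M.Map Int Post -> Int -> [Post]
parentChain table = go S.empty
  where
    go seen pid
      | pid `S.member` seen = []            -- "oh hey", we've looped
      | otherwise = case M.lookup pid table of
          Nothing -> []
          Just p  -> p : maybe [] (go (S.insert pid seen)) (postParent p)
```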

Realistically, I should throw this shit out and do it some other way, especially since the code generation bit has become pig disgusting. Fuck! (Eventually I’ll rewrite it using Template Haskell or something).

No comments

(Untitled)

April 06th, 2009 | Category: Random

Well I feel really stupid.

I spent the last couple of days writing a shitty ORM layer in Haskell. Basically, I didn’t like passing around opaque [SqlValue]s everywhere (or [Data.Map String SqlValue]s), so I wrote a bit of code which takes a schema definition and outputs both the SQL to generate the schema in the database and a Haskell source file defining a record for each of the tables.

Basically, transforming this:

Into this file, which is too long to paste (the SQL is generated too, but SQL is boring).
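The embedded snippets didn’t survive here, but the shape of the generator is easy to gesture at: a table description goes in, a string of Haskell record source comes out. A deliberately tiny stand-in (the `Table` type and `genRecord` are invented for illustration, not the real generator):

```haskell
import Data.Char (toUpper)
import Data.List (intercalate)

-- A column is a (name, haskellType) pair; a table is a name plus columns.
data Table = Table { tblName :: String, tblCols :: [(String, String)] }

-- Emit a record declaration for one table, e.g.
--   data Post = Post { post_id :: Int, post_title :: String }
genRecord :: Table -> String
genRecord (Table name cols) =
  "data " ++ cap name ++ " = " ++ cap name ++ " { "
    ++ intercalate ", " [ name ++ "_" ++ c ++ " :: " ++ t | (c, t) <- cols ]
    ++ " }"
  where
    cap (x:xs) = toUpper x : xs
    cap []     = []
```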

The first little nasty I ran into was “oh hey I want to be able to serialize these records into JSON”. So I threw wrappers into the generated code (it took like 6 lines of Haskell to generate the serializers), not a big deal.

And now I’m realizing that HStringTemplate, the engine I want to use to generate the HTML, isn’t going to want to eat these records. It wants associative arrays. At this point I’m kind of going FUUUUUUUUUUUUUUUUUU, but realistically, since I have JSON representations of everything I can simply do let toAssocList = decode . encode. That’s right. Encode the record into JSON, then let Text.JSON turn that into an associative array.

Realistically I should just write a bit of code to convert each record into an associative array, then use that code in the JSON serialization stuff. Actually that’s not a half-bad idea. Let me do that now.
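That plan, sketched: one `toAssocList` per record, and the JSON serializer becomes a fold over its output instead of a second hand-rolled traversal. The `Val` type below is a stand-in for Text.JSON’s JSValue, and all the names here are mine:

```haskell
import Data.List (intercalate)

-- A stand-in value type; the real thing would be Text.JSON's JSValue.
data Val = VInt Int | VStr String deriving (Eq, Show)

data Post = Post { postId :: Int, postTitle :: String }

-- The one conversion both consumers share: record -> assoc list.
-- HStringTemplate-style template engines can eat this directly.
toAssocList :: Post -> [(String, Val)]
toAssocList p = [("id", VInt (postId p)), ("title", VStr (postTitle p))]

-- JSON serialization is then just a fold over the assoc list.
toJson :: Post -> String
toJson p = "{" ++ intercalate "," [ show k ++ ":" ++ render v | (k, v) <- toAssocList p ] ++ "}"
  where
    render (VInt n) = show n
    render (VStr s) = show s
```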

It may be LOL BLOAT, but it makes writing the application code really straightforward (so far, at least) –

2 comments