The random rantings of a concerned programmer.

Archive for January, 2009


January 31st, 2009 | Category: Random

oh fuck the new version of Xorg is completely fucking broken — there’s something fucked up with hald or the xf86-input-mouse driver or something diddling up in the old USB stack. At times the screen will just stop being fucking drawn until I move the mouse.

  • Whenever I scroll the scrollwheel a single notch, the entire thing freezes until the next mouse event.
  • Double clicking doesn’t fucking work most of the time (freezes screen until next mouse event).

Fucking balls this is gay.

UPDATE: Recompiled x11-servers/xorg-server with WITHOUT_HAL (uncheck the option in `make config`), then added

Option "AllowEmptyInput" "off"

to the ServerFlags stanza, and now everything seems to be back to normal, hooray.
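For reference, the whole thing ends up looking something like this in xorg.conf (if you already have a ServerFlags section, the rest of it stays as it was):

```
Section "ServerFlags"
        Option "AllowEmptyInput" "off"
EndSection
```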



January 30th, 2009 | Category: Random

I rm -rf'd /usr/local last night because my poor laptop was getting bogged down with too much bullshit software (it had gotten to the point where portupgrade took more than 8 hours to complete, which is retarded). In non-BSD speak, this means I reduced my machine to a base install, removing all installed packages.

I fetched a new copy of the source tree and updated my kernel from 7.1-PRERELEASE to 7.1-RELEASE-p2 (about time lawds), updated my ports tree and started installing shit –

  • ports-mgmt/portmaster
  • editors/vim
  • sysutils/ezjail
  • x11/xorg-minimal

and then I went to bed. Fucking Xorg takes like a billion hours to compile (and installing from packages is more of a pain than it’s worth). I came to work this morning after it had finished and tried to get going with a startx, which failed — forgot to install x11-drivers/xf86-video-ati, of course.

After I got the driver re-installed, X came up, but neither the keyboard nor mouse worked. FFFFFFFFFFF. After a bit of digging (and reading UPDATING) I figured out that fucking Xorg is hell-bent on using hald instead of the shit that’s in xorg.conf, so I kickstarted that shit and now everything works.

Fuck yeah Seaking.

Also I got paid like $1500 today and I’m gonna burn it all on hookers and blow tonight.



January 25th, 2009 | Category: Random

I ended up just using Text.Regex to parse the Symfony routing.yml bullshit. Basically, I’ve got a bit of code which reads in some routing Yaml like

  url: /index
  param: { module: index }

  url: /tag
  param: { module: tag, action: list }

  url: /tag/:id
  param: { module: tag, action: view }
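Grabbing those pairs doesn't need anything close to a full Yaml parser. Here's a line-based sketch in plain Haskell of the same idea (the post's actual code uses Text.Regex; every helper name here is made up):

```haskell
import Data.Char (isSpace)
import Data.List (stripPrefix)

trim :: String -> String
trim = dropWhile isSpace . reverse . dropWhile isSpace . reverse

splitOn :: Char -> String -> [String]
splitOn c s = case break (== c) s of
  (x, [])     -> [x]
  (x, _:rest) -> x : splitOn c rest

-- "{ module: tag, action: list }" -> [("module","tag"),("action","list")]
parseParams :: String -> [(String, String)]
parseParams s =
  [ (trim k, trim (drop 1 v))
  | item <- splitOn ',' (filter (`notElem` "{}") s)
  , let (k, v) = break (== ':') item ]

-- Walk the lines, pairing each "url:" line with the "param:" line after it.
parseRoutes :: String -> [(String, [(String, String)])]
parseRoutes src = go (map trim (lines src))
  where
    go (u:p:rest)
      | Just url  <- stripPrefix "url:" u
      , Just prms <- stripPrefix "param:" p
      = (trim url, parseParams prms) : go rest
    go (_:rest) = go rest
    go []       = []
```

So `parseRoutes "url: /tag/:id\nparam: { module: tag, action: view }\n"` gives back `[("/tag/:id", [("module","tag"),("action","view")])]`, which is all the build step needs.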

Then, as a build step, I parse and transform it into a couple of tables which get statically linked into the application –

module Site.RoutingTable where
import qualified Site.Index
import qualified Site.Tag
import Data.Map (fromAscList)
import Routing (Route (..))

actionTable = fromAscList [
        (("Index", "index"), Site.Index.index),
        (("Tag", "list"), Site.Tag.list),
        (("Tag", "view"), Site.Tag.view)]

urlTree = Branch ([Branch ([Literal ("tag",RouteEnd (Just [("module","tag"),
("action","list")]))],Just (Parameter ("id",RouteEnd (Just [("module","tag"),
("action","view")])))),Literal ("index",RouteEnd (Just [("module","index")]))],Nothing)

Basically, the urlTree is a decision tree which encodes all the possible paths through the routing table –

- index - end
| tag   - end
        | :id - end

When a request needs to be routed, the URL is split on the '/' delimiter into a list of strings. The head is popped off the front of that list at each level of the tree until it hits a RouteEnd element, or a Branch element with no corresponding Literal and no Parameter or Wildcard part (in the latter case, the route fails and an HTTP 404 response should be returned).

Parameter and Wildcard nodes are only taken when there isn’t a competing Literal node. When a Parameter node gets taken, the value it corresponds to gets pushed onto an associative list for processing later on.

When the routing process hits the end, the RouteEnd node contains the parameters specified in the Yaml. Those parameters are merged with the parameters generated by the Parameter nodes in the tree traversal (but will not overwrite — the parameters in the Yaml simply serve as defaults).

Thus we can route a Url -> Parameters, where Url is just a String and Parameters is [(String, String)].
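The traversal above can be sketched compilably. The node layout here is simplified from the post's actual Branch/Literal/Parameter encoding (literal children, an optional Parameter child, and optional end-of-route defaults are bundled into one constructor, and Wildcard is omitted), so treat it as an illustration rather than the real thing:

```haskell
data Route = Branch
  { literals  :: [(String, Route)]        -- taken first, when a segment matches exactly
  , parameter :: Maybe (String, Route)    -- taken only if no Literal matches
  , routeEnd  :: Maybe [(String, String)] -- Just defaults if a route may end here
  }

-- split a URL on '/', dropping empty segments
splitUrl :: String -> [String]
splitUrl = filter (not . null) . go
  where go s = case break (== '/') s of
                 (seg, [])     -> [seg]
                 (seg, _:rest) -> seg : go rest

route :: Route -> String -> Maybe [(String, String)]
route tree url = go tree (splitUrl url) []
  where
    -- end of the URL: succeed only if a route terminates here; the Yaml
    -- defaults are merged in but never overwrite collected Parameter bindings
    go node [] acc = fmap (merge acc) (routeEnd node)
    go node (seg:rest) acc =
      case lookup seg (literals node) of
        Just sub -> go sub rest acc
        Nothing  -> case parameter node of
          Just (name, sub) -> go sub rest ((name, seg) : acc)
          Nothing          -> Nothing            -- dead end: HTTP 404
    merge acc defs = acc ++ [p | p@(k, _) <- defs, k `notElem` map fst acc]

-- the tree for the three routes above
urlTree :: Route
urlTree = Branch
  [ ("index", Branch [] Nothing (Just [("module", "index")]))
  , ("tag",   Branch []
        (Just ("id", Branch [] Nothing (Just [("module","tag"),("action","view")])))
        (Just [("module","tag"),("action","list")]))
  ]
  Nothing
  Nothing
```

With that, `route urlTree "/tag/42"` yields `Just [("id","42"),("module","tag"),("action","view")]`, and `route urlTree "/nope/x"` yields `Nothing`.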

Next, we need to resolve what code to execute. Contained within the parameters must be a "module" and an "action" parameter — these are resolved in the table to a function pointer within the application’s namespace. The application functions are all existentially quantified as

data Action = forall a. IConnection a => MkAction (RouteParameters -> a -> String)

– they get passed both an HDBC database connection and the URL parameters, and return a string which gets output to the browser.
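One nit worth flagging: with the existential exactly as declared, the handler's hidden connection type can't actually be applied to a caller-supplied connection, so the dispatch sketch below uses the universally quantified variant instead (an assumption about the intent). IConnection here is a stand-in for HDBC's class of the same name, and the handler table entries are hypothetical:

```haskell
{-# LANGUAGE RankNTypes #-}

type RouteParameters = [(String, String)]

-- stand-in for Database.HDBC.IConnection, with a toy method for illustration
class IConnection a where
  connName :: a -> String

-- universal (Rank-2) variant: any handler must work with any IConnection
data Action = MkAction (forall a. IConnection a => RouteParameters -> a -> String)

-- a dummy connection type, so the sketch is self-contained
data FakeConn = FakeConn
instance IConnection FakeConn where connName _ = "fake"

-- hypothetical handler table; the real one maps to Site.*.* functions
actionTable :: [((String, String), Action)]
actionTable =
  [ (("tag", "view"), MkAction (\ps c ->
        "viewing tag " ++ maybe "?" id (lookup "id" ps) ++ " via " ++ connName c)) ]

-- pull "module"/"action" out of the routed parameters and run the handler
dispatch :: IConnection c => RouteParameters -> c -> Maybe String
dispatch ps conn = do
  m <- lookup "module" ps
  a <- lookup "action" ps
  MkAction f <- lookup (m, a) actionTable
  return (f ps conn)
```

E.g. `dispatch [("module","tag"),("action","view"),("id","7")] FakeConn` produces `Just "viewing tag 7 via fake"`.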

A good use of a weekend, I guess. [source]


Haskell + Yaml

January 24th, 2009 | Category: Random

Ugh, I’m trying to parse some Yaml shit for some application in Haskell, and I can’t fucking do it. I found two existing parsing libraries, YamlReference and HsSyck.

I can’t make heads or tails of YamlReference.

And (after much frustration when unable to figure out why the balls my “am I too stupid to use this library” test kept breaking) I’ve decided that HsSyck is broken –

ghci> import Data.Yaml.Syck
ghci> parseYaml "x:\n  y: hello\n  z: {bampu: pantsu}\n"
-- xbox huge output
ghci> emitYaml it
"--- \n? \"\\0\"\n: \n  ? \"\\0\"\n  : \"\\0\\x13\224( \"\n\n  ? \"\\0\"\n  : \n    ? \"\\0\\x13\224( \"\n    : >-\n      \NUL\DC3\224( n\n\n"

At this point I’m just going to say “fuck it” — the shit I need to parse is simple enough that I can probably whip up a couple regexes to grab the shit I need out of it. Annoying as fuck but I’m not nearly smart enough to write a compliant Yaml parser (it’s complicated as shit) so I can’t do shit about it.

EDIT: Actually, it looks like the reason my parser failed is that the Yaml snippet

  x:
    y: hello
    z: { bampu: pantsu }

gets converted into, effectively:

[("x", [("z", "bampu"), ("z", [("bampu", "pantsu")])])]

(for reasons I can’t fathom) and my parser shits itself because “z” is in there twice for some strange reason (and that reason is probably buried in the Yaml spec). The emitYaml function must just be completely fucked (or outputting horrendous yet correct Yaml). God this shit is retarded.



January 23rd, 2009 | Category: Random

One of our legacy machines at work got hacked again today (it’s not a machine I have any control over). There was an old Perl script with the following code –

@QS = split(/&/,$QS);

foreach $i (0 .. $#QS) {
        $QS[$i] =~ s/\+/ /g;
        ($key,$val) = split(/=/,$QS[$i],2);

        $key =~ s/%(..)/pack("c",hex($1))/ge;
        $val =~ s/%(..)/pack("c",hex($1))/ge;

        $QS{$key} .= "\0" if (defined($QS{$key}));
        $QS{$key} .= $val;
}

# $filedir is built from the decoded query string, so this two-argument
# open pipes attacker-controlled text straight through /bin/sh
open(FILE,"cat $filedir | tr -d '\012' |");

which, hurr durr, allows you to execute arbitrary shell commands by giving it an ID with some pretty backticks. The attacker was nice and decided to `wget` (or whatever) a rather shitty web console onto the machine. He might have done some other nasty shit (because once you can execute arbitrary shell commands the next step is to get root access — and if you’ve got root access you can cover your tracks pretty damn well) but it’s not my machine so I don’t give a damn.

Some interesting points –

  1. The web server user, apache, has ownership of all the files in their webtree, which means it can write to any of them and create files wherever it wants in there. Hurr durr.
  2. They use Subversion to keep content updated on the machine, and their authentication information is cached on there (and belongs to the apache user), so if you get local access you can check out all the code running on that machine for your local viewing pleasure (and probably commit it back, though someone would notice). Hurr durr.
  3. This isn’t the first time this attack vector has been used — someone got in before and did the same thing. The sysadmins just removed the files that were dropped in and brushed the incident under the rug (or something?)

Megassa facepalm nyoro~n.

