I ditched work a little early today. I’m migrating our website from a pig-disgusting ColdFusion mess over to pig-disgusting Drupal. It’s a fucking nightmare; how the hell did I turn into a fucking PHP developer? Times like these make me want to build up a non-webshit portfolio and go shopping for a new job, though the market around here is kind of shitty for that right now.
Started tinkering around with Haskell + OpenGL last night. Using some old C code of mine as a reference, I got textures and vertex arrays working fine; once I hammer out VBOs I should have everything I need to render with. I don’t know what kind of game or program I’d write though, so I’ll probably just get bored with it. I put the shit code on GitHub because, as shitty as GitHub is (slow-as-fuck Ruby web interface), it’s kind of nice to have a free remote, public VCS. And git itself isn’t too bad.
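For reference, the VBO part I still need to hammer out really boils down to a handful of calls. A minimal sketch in C (same language as the reference code), assuming a live GL context and an extension loader like GLEW already set up; the helper names are mine, only the gl* entry points are the real GL 1.5 API:

```c
#include <GL/glew.h>   /* glGenBuffers & friends are GL 1.5 entry points */

/* Upload a vertex array into a buffer object once, at load time. */
static GLuint make_vbo(const GLfloat *verts, GLsizeiptr bytes)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);               /* allocate a buffer name     */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);  /* make it the current buffer */
    glBufferData(GL_ARRAY_BUFFER, bytes, verts, GL_STATIC_DRAW);
    return vbo;
}

/* Draw from the VBO: same client-state calls as a plain vertex array,
 * except the pointer argument becomes a byte offset into the buffer. */
static void draw_vbo(GLuint vbo, GLsizei n_verts)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0);
    glDrawArrays(GL_TRIANGLES, 0, n_verts);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```

The nice part is that migrating from plain vertex arrays is mostly just adding the bind and treating the pointer arguments as offsets.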
Right now I’m compiling CURRENT from source, over NFS, on a netbooted machine (because I forgot the goddamn install CD at work fucking AGAIN and I’m sick of having shit strewn around my desk). It’s pulling the source off NFS, but /usr/obj is on a disk. Figure I’ll buildkernel + buildworld, then install them to the disk, then boot off that. Pulling the source over NFS takes surprisingly little bandwidth; I assumed the network link would be the bottleneck. I guess it’s pulling on-demand, so it just adds latency to each request. Should have unionfs’d the UFS mount over the NFS mount and pulled everything over concurrently with the install (via find . | xargs touch or similar).
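The unionfs trick would look something like this on FreeBSD (the server name and paths are made up for illustration, and the copy-up-on-touch behavior is what I’d expect from unionfs, not something I’ve timed):

```shell
# Lower layer: the source tree, read-only off NFS (hostname/paths are examples).
mount -t nfs -o ro server:/usr/src /usr/src

# Upper layer: an empty directory on the local UFS disk, unioned on top.
mount -t unionfs /local/src-cache /usr/src

# Concurrently with the build, touch every file: writing metadata makes
# unionfs copy each file up into the UFS layer, so subsequent reads come
# off the local disk instead of the wire.
cd /usr/src && find . -type f | xargs touch &
```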
Took me fucking ages to sift through all the shit I post to find the nfsbooting notes from 2007 though. I guess I should start tagging posts or something.
Fffh, so we finished our project on Chaotic Audio Synthesis, and the application itself turned out pretty fucking nice. Not only is it really fun to play around with, but it actually turns out sounds that aren’t too bad. Chaotic, but not that bad.
The real problem was giving the presentation (I’m still sitting here watching other people present their projects right now). I don’t fucking know what happened, but I couldn’t get the application to run on the projector. All the builds I did were hardcoded to 1280×800 fullscreen, and for some reason I couldn’t get my fucking laptop to mirror the desktop on both video devices (the laptop LCD and the projector).
So I quickly apologized to the audience, did a build with an 800×600 windowed resolution, ran it, and dragged the window to the other monitor. Which worked. The problem was, for some fucking arbitrary reason, the GUI decided to kill itself in that configuration. I have no idea why; probably something weird and machine-specific (it works fine on my machine now).
Despite our fucking fail presentation skills, I think everyone was fairly impressed with our application. It’s a fucking awesome thing, makes pretty pictures and not too bad randomly-generated “music”. I’ll package it up in a couple of days (going to hack the config file to include resolution information) and post it here, but for now just pictures.
So I’ve been working on a semi-semester-long project which is due tomorrow evening. The idea is to take a chaotic simulation, in our case a homebrew particle engine, and generate audio from it. Kind of like some slick auto-audio synthesis machine.
We made a 2D prototype of the particle engine for the abstract demo, got it to render quite a few particles pretty easily. The final product is supposed to render in 3D (which, as far as particle engines go, isn’t that big of a deal), but we’ll see how that turns out. I have a feeling that, even with 10000 particles in an enclosed region, it might feel a little sparse compared to the 2D example. But then again, I dunno.
So, in the meantime I’ve been tinkering with the win32 MIDI interface, which is actually not too bad. The problem I’m running into is that the chaotic simulation produces fairly chaotic “music” (read: ear-shattering), because I’m basically just passing the raw events to the MIDI controller without really caring about them. Thankfully my partner knows a thing or two about music and wrote some code that takes the unfiltered values and maps them to notes on a scale. His prototype app, which just generates random values with rand(), filters them, and then sends them to the MIDI device, actually sounds quite good.
Hopefully I’ll get that integrated with the 2D version soon; I think I’m going to branch the 2D version and convert it over to 3D (with billboarded sprites!), so that’ll be fun. Woot woot.
And once it’s done I’ll put up a download link for the lulz or something.