Archive for October, 2008
So it turns out that the chick I was fucking has fallen in love with me (or some stupid shit), which means I have to get rid of her. The sex wasn’t very good anyway — I always ended up doing all the work. I get more pleasure fucking Rosy Palmer.
I gave a demo of my Business Workflow Engine on Thursday. They actually liked my design a lot — using an expressive language rather than raw XML “programs”, storing state to disk with a flat-file database rather than an RDBMS, non-destructive updating so workflow state could be reconstructed, etc. What they didn’t like was my SOAP library.
SOAP isn’t a complicated thing. It’s just a specification for sending an XML payload. It’s really fucking simple.
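To show just how simple: the whole wire format is one envelope with a body in it. Here's a minimal sketch in Python — the service name, namespace, and parameter are all made up for illustration:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_envelope(method, ns, params):
    """Build a minimal SOAP 1.1 request envelope as a string.

    `method`, `ns`, and `params` are whatever the remote service
    expects -- the values used below are fabricated examples."""
    env = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(env, "{%s}Body" % SOAP_NS)
    call = ET.SubElement(body, "{%s}%s" % (ns, method))
    for name, value in params.items():
        arg = ET.SubElement(call, name)
        arg.text = str(value)
    return ET.tostring(env, encoding="unicode")

envelope = build_envelope("GetQuote", "urn:example:stock", {"symbol": "AMD"})
```

That's it — POST that string at an endpoint and you've "done SOAP". Everything painful lives in the description layer, not here.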
The standard method to dynamically describe that payload, WSDL (and XSD, which describes the data types used in the payload), is complicated as fuck. It’s so fucking complicated, I swear, there’s a fucking standard for what not to do with it (though I can’t find a citation to back that up). My workflow engine was written in C, and requires dynamic invocation of remote services. There are a couple of SOAP libraries for C (csoap, gsoap, Axis2/C), but all of them handle WSDLs statically — you hand their tools the WSDL file and they spit out a C stub which has to be compiled into your binary. Useless. So I had to write my own WSDL/XSD parser and tie it into my own SOAP client library, from scratch.
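For a taste of what the dynamic approach involves: even the trivial first step — pulling operation names and their input messages out of a WSDL — needs namespace-aware tree walking. A minimal Python sketch (the embedded WSDL fragment is fabricated, and real documents are vastly messier):

```python
import xml.etree.ElementTree as ET

WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

# A stripped-down, made-up WSDL fragment for illustration.
wsdl_doc = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/">
  <portType name="QuotePort">
    <operation name="GetQuote">
      <input message="tns:GetQuoteRequest"/>
      <output message="tns:GetQuoteResponse"/>
    </operation>
  </portType>
</definitions>"""

def list_operations(wsdl_text):
    """Return (operation name, input message) pairs from a WSDL document."""
    root = ET.fromstring(wsdl_text)
    ops = []
    for op in root.iter("{%s}operation" % WSDL_NS):
        inp = op.find("{%s}input" % WSDL_NS)
        ops.append((op.get("name"), inp.get("message") if inp is not None else None))
    return ops
```

Now imagine that, but resolving every `tns:` message reference through imported XSD schemas, in C, by hand. That's where the 2,000 lines went.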
Which, as my superiors pointed out, is a maintenance disaster waiting to happen. My shitty non-standard implementation was a little over 2,000 lines of C. To put things in perspective, the Java wsdl2c parser (excluding the part which actually outputs the C from the parsed type information) is 12,000 lines long. And even then, that shit isn’t fully standards-compliant. It’s ridiculous.
But despite that, the Axis2 implementation is fairly “standard” (as in, widely-used, not actually conformant), so the argument is that if we used it, we could just tell people writing services for us to consume “make your WSDL Axis2 parseable”. Unfortunately, only the Java version of the implementation does this shit dynamically, which means migrating the whole fucking stack over to a Java/Tomcat-based implementation. Which means rewriting the whole fucking thing yet again (first implementation was in Ruby, second in C, now the third will be in Java).
Needless to say, I am currently shopping around for a new position. I applied for a Senior Application Developer position in the Health Science library; the main project is writing a scraper/reindexer for medical journals in Python. Sounds familiar. I’m hoping for an offer, but we’ll see what happens :3
Ugh, so I got served up a DMCA notification at work. They let me off the hook with the warning “ONE MORE STRIKE AND YOU’RE OUT” so I’m shitting my pants a bit and trying to snag a new job or something (especially since my 1500-hour contract expires next month).
And then the host I was using for 4scrape bottomed out, saying that I was over my bandwidth limits and dropped me down to DSL speeds. Hurr. Shuugo (of Konachan) was in the process of leasing a beefy server to virtualize and consolidate his assets, so I decided to pitch in with him. I’ve got 4scrape all set up (the image data is being transferred on-demand), but we need to tweak the balls out of VMware before it’s actually usable again. It’s SLOW AS FUCK as it stands.
Chaos;Head needs considerably more violence and gore. They’re just moving the plot along at this point, I think :<
I got an iqdb interface for 4scrape up and running, which means that you can now use piespy’s much better image search function to search for detexts and stuff. It’ll be another 8 hours or so until he’s got all of 4scrape ingested, so it’ll probably be ready by tomorrow morning. It’s pretty neat.
Another interesting 4scrape thing I noticed today was that Google is learning how to parse the threads. If you look at some of the results, you’ll notice that Google was able to count the number of posts and figured out which one was the most recent. Google has only the tip of the iceberg in its index (it returns 7,000 results; there are 22,000 threads and 100,000 image pages), but nonetheless searching for Rozen Maiden turns up relevant results, albeit not nearly as many as it could.
As a final note, 4scrape is up to 1 million pageviews/month, which means it’s generating about $0.20/day in ad revenue! Hooray money!
I was rambling in IRC the other day about how X11 is balls (I ramble about this a lot) and how bitmap-based displays are so well-entrenched that innovation is stifled. X11 has been around since the dawn of time, and it works. Because it works, historically people have just added more and more cruft onto it (Mesa/DRI, etc) and now it’s a hulking mammoth upon which everything seems to teeter.
Now, at the time, I didn’t really have an alternative to a bitmap-based display. The current video output device paradigm — a flat monitor — is itself a bitmap display. It seems logical that your display system, then, would be a simple hierarchical partitioning scheme for that physical display space.
Realistically though, modern hardware doesn’t need to treat everything as a bitmap — video cards are more than capable of dealing with 3D geometry. This birthed compositing window managers (Compiz, Aero, etc), which render the bitmap displays onto a 3D surface, then render the 3D surfaces in a 3D environment. At their core though, the applications are still rendering to a bitmap display.
What if, instead of rendering to a bitmap display, the applications pushed 3D geometry (and transformations to that geometry) to the X server?
Each application/window would consist of a relative coordinate space and associated geometry, data and transformations. Taking a mundane example, an image viewer would want to render a collection of images. Each image has a quad to render its texture onto. The application would pass the vertex and texture data to the server once, then alter the vertex data by using simple linear transformations.
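The "upload vertices once, then just send transforms" idea is plain affine math on the quad's corners. A sketch of what the server would do with a client-supplied transformation (plain Python, no GL, all names invented):

```python
def apply_transform(vertices, matrix):
    """Apply a 2x3 affine matrix [[a, b, tx], [c, d, ty]] to (x, y) vertices."""
    (a, b, tx), (c, d, ty) = matrix
    return [(a * x + b * y + tx, c * x + d * y + ty) for x, y in vertices]

# A unit quad for one image, uploaded once by the client...
quad = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]

# ...then repositioned by sending only a transform: scale 2x, translate by (3, 4).
moved = apply_transform(quad, [(2.0, 0.0, 3.0), (0.0, 2.0, 4.0)])
# moved == [(3.0, 4.0), (5.0, 4.0), (5.0, 6.0), (3.0, 6.0)]
```

The payoff is bandwidth: after the initial upload, moving or scaling a window is six floats over the wire instead of a full bitmap repaint.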
Since each application’s coordinate space is relative, we need to fix them to an absolute coordinate space before we can render them. This mapping of application-to-absolute coordinates would be the equivalent of a traditional workspace, essentially providing different views/organizations of the running applications.
Each physical screen, then, would render a workspace (allowing multiple screens to render the same workspace, etc). The screen basically acts as a viewport, having a location and orientation within the absolute coordinate space of the workspace it’s rendering.
As such, this adds another level of complexity to the traditional client-server model that the X Window System uses — we now have Application-Server-Display. The application provides the server with raw geometry and transformations. The server maintains that state, in addition to the state of workspaces. It provides the data to the display on-demand, based on the display’s view frustum.
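A rough sketch of the state such a server would keep — every name here is invented for illustration, and real per-window state (textures, index buffers) is elided:

```python
from dataclasses import dataclass, field

@dataclass(eq=False)
class Window:
    """Per-application state: geometry in the app's own coordinate space."""
    vertices: list  # [(x, y, z), ...], uploaded once by the client

@dataclass
class Workspace:
    """Maps each window into one absolute coordinate space via an offset."""
    placements: dict = field(default_factory=dict)  # Window -> (dx, dy, dz)

    def absolute_geometry(self, window):
        """Fix a window's relative coordinates into the workspace's space."""
        dx, dy, dz = self.placements[window]
        return [(x + dx, y + dy, z + dz) for x, y, z in window.vertices]

@dataclass
class Screen:
    """A viewport: renders one workspace from a position/orientation."""
    workspace: Workspace
    position: tuple = (0.0, 0.0, 0.0)

# One app window pinned into a workspace at x=10:
viewer = Window(vertices=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)])
desk = Workspace()
desk.placements[viewer] = (10.0, 0.0, 0.0)
```

Two `Screen`s pointing at the same `Workspace` from different positions gives you the multiple-views-of-one-workspace behavior for free, which is the whole point of splitting placement out of the window itself.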
Basically, another strange idea of mine which will never go anywhere hahaha.
Chaos;Head fucking rocks.
So I hooked up with a girl last weekend, and we traded diseases (both had a throat infection of one type or another, thought it was the same shit but it wasn’t). As such, I only managed to crawl to work (or even out of bed) a couple of days in the past week. Thankfully I’ve stopped coughing up blood long enough to dig through my email, troll the boards and get banned on IRC again.
Since I practically didn’t go to work at all this week (and don’t get paid sick days) my next paycheck is going to be smaller than my dick, which means I won’t be able to afford a new toy I’m looking at. Going to sign a lease on a house for the next couple years; I gathered up a bunch of otaku to live with. Going to be an absolute shitfest. In any case, I want a compact media center, and I figure an AMD Geode LX 800 would make a nice one.
The LX 800 isn’t a powerful beast. From what I’ve heard, it’s got the juice of a P3 wrapped up in a compact package. It certainly can’t decode H.264 video in real time (though one of the MX-series Geodes supposedly has a hardware H.264 decoder; it’s probably significantly more expensive than the LX). I imagine it can run an Xorg server and act as a thin client, allowing me to offload everything to a centralized machine and just stream the frames over a 1000bT Ethernet cable.
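Back-of-the-envelope check on whether the thin-client idea even fits on the wire (the resolution and refresh rate here are my assumptions, not measurements):

```python
# Streaming raw, uncompressed frames over gigabit: does it fit?
width, height = 1024, 768      # assumed projector resolution
bytes_per_pixel = 3            # 24-bit color
fps = 30

frame_bytes = width * height * bytes_per_pixel       # one raw frame
bits_per_second = frame_bytes * 8 * fps              # sustained stream rate
gigabit = 1_000_000_000

utilization = bits_per_second / gigabit  # fraction of the 1000bT link used
# ~0.57 -- tight, but it fits
```

So even totally uncompressed frames squeeze under a gigabit link at that resolution, which is why the thin-client route is plausible at all; any compression just buys headroom.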
But in that case, why bother having the Geode in the first place (unless I buy a lot of Geodes and scatter them around for some reason…)?
I do want a fucking media center again though. My current setup has the fileserver upstairs acting as a wireless router, and streaming encoded video over a wireless network (down two floors into the basement where the projector is) doesn’t fucking work, and I’m unwilling to run an Ethernet cable that far. Being able to stream video makes the entire setup much more usable, since you can browse and view content from the repository on-demand, rather than having to download it and then watch it.