Not software engineering’s finest moment.
Since the oggOmatic uses a C3, it doesn't have very much floating point processing power to throw at decoding .ogg files. So you can imagine that I was quite pleased to see that Xiph has an integer-only version of their Ogg decoder. So, when I reformatted the disk on the oggOmatic (it was about 1GB too small for the jukebox), I installed igg123 instead of ogg123, and, wanting to bask in the better performance of an integer decoder, I fired it up and ran top to watch.
Hmm, that was funny; it seemed to be taking 15% of the CPU while the floating point version was only taking 10%. So I compiled ogg123 myself (I didn't use the prebuilt port, because it wanted me to load up a wad of other decoders, and I definitely didn't build it from the port, because ports are broken and it would have tried to drag in 700MB of gnuware just to build those stupid other decoders; fighting with the new! almost portable! configure was pretty small potatoes compared to that), and that gave me the chance to time each of them:
igg123 (integer party): 293.36 real, 38.11 user, 2.31 system
ogg123 (floating point party): 293.30 real, 31.23 user, 2.34 system
Timed them again. Same numbers.
That's pretty amazing: integer math coming out slower than floating point math on a machine with a sucky floating point unit. (Yes, I know a fixed-point decoder generally has to grind through more operations than a floating point one, but (a) decoding a .ogg file seems like something that could be easily optimized, and (b) the C3's floating point really is sucky, so I'd think the integer version could do better.) Perhaps the integer version is really meant for processors with no floating point unit at all, and they punted on performance just to get it working.
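To make that concrete, here's a sketch of where the extra user time could plausibly be going. This is my own illustration, not the decoder's actual code, and the mac_float/mac_fixed names and the Q15-style scaling are just assumptions for the example: it shows the multiply-accumulate that windowing/MDCT-style inner loops run over and over, once in floating point and once in fixed point.

    #include <stddef.h>
    #include <stdint.h>

    /* Floating point: one multiply and one add per sample. */
    static void mac_float(float *acc, const float *x, const float *w, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            acc[i] += x[i] * w[i];
    }

    /* Fixed point, with samples and window coefficients in a Q15-style
       format: every multiply becomes a widening 32x32->64 multiply plus a
       shift to renormalize, so the integer path does strictly more work
       per sample than the float path. */
    static void mac_fixed(int32_t *acc, const int32_t *x,
                          const int32_t *w, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            acc[i] += (int32_t)(((int64_t)x[i] * w[i]) >> 15);
    }

On anything with a working FPU, even a sucky one, the float loop gets each product in a single instruction; the fixed-point loop really only wins outright when there's no FPU at all, which lines up with the "they punted on performance" theory.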