Inventing schemes is not really cool

Android Developers Blog: Integrating Application with Intents — a nice writeup about a really, seriously underappreciated Android feature: Intents. Somehow it's reminiscent of the revolutionary Magic Ink proposal, albeit on a push, not pull, basis.

Having said that, I can't shake the feeling that making up URI schemes is not cool. Especially when they're made up to go with an Intent action which is actually redundant with the scheme. A URI should point to a location, not embody an action; actions should go in the action field (hey, action–action, ya see? almost as if it was designed for that!).
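To make the redundancy concrete, here's a plain-Java sketch (using java.net.URI instead of the Android classes; the class and field names are invented for illustration): the same request expressed with the action smuggled into a custom scheme versus kept in a separate action field:

```java
import java.net.URI;

public class IntentStyle {
    // Stand-in for an Intent: action and data kept separate.
    static final class Request {
        final String action;
        final URI data;
        Request(String action, URI data) { this.action = action; this.data = data; }
    }

    public static void main(String[] args) {
        // Made-up scheme: the verb ("show") is baked into the URI itself,
        // so any action you also set on the intent just duplicates it.
        URI madeUp = URI.create("showcontact://people/1");

        // Location-only URI; the verb lives in the action field.
        Request clean = new Request("VIEW", URI.create("content://contacts/people/1"));

        System.out.println(madeUp.getScheme());
        System.out.println(clean.action + " " + clean.data);
    }
}
```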

OpenSuSE 11.2, ath5k and channel 13

Kernel 2.6.28 added brand-new WiFi regulatory domain handling; so if you upgrade from OpenSuSE 11.1 to 11.2 (and thus from 2.6.27 to 2.6.31), you get this shiny new (FCC) sheriff's badge right there too. This is all fine and cool, only the userspace part is utterly broken in SuSE, so you're limited to the world regulatory domain.

Ah, well, downloading Lineville's regulatory.bin fixes it. (Only it's a bit hard if you have NO FREAKING NETWORK because you use channel 13 on your AP and your OS decides that it's no channel 13 for you.) Or does it? Wait, it only gets better.

Now, apparently, your network card feels like a citizen of the world. It doesn't matter that you're an EU citizen in the EU and that the computer was bought in the EU. The EEPROM says world regdom, so the kernel must obey. So you end up patching the driver to report the correct regdom to the kernel (because it obviously won't just let you override it, you malicious, evil bastard, you). I sure hope you have that kernel source around! And development tools. And stuff. Although I'd advise against those -s in the patch, they seem to break it further, dunno why.

Apparently you can use ath_info to patch up your EEPROM to report the correct regdom, so you don't have to patch up ath.ko every time a kernel upgrade comes down the line. Fortunately the 11.1 repo's version still works. (It's regdom 0x37 for the EU.) Just that... the card refuses to write it. So you're stuck with patching. Or just shut up and don't use channel 13. Which might explain exactly why it's usually less crowded. (Which, incidentally, is exactly why you'd want to use it.)

Oh, wait, you can enable EEPROM writing on AR5110. You just need to pull the GPIO EEPROM write enable pin. Of course, there's no datasheet anywhere in sight. You're lucky your humble editor took the risk and brute-forced all 10 pins (ok, not all, but he was ready to) and found the magic combination. It's:

# ath_info -g 1:0 -w <base-address> regdomain 0x37

lspci -v will tell you the address.

Broken SPAA on Radeons with analog panels

If you have a Radeon with an analog LCD panel and, after upgrading something (possibly the radeon driver), subpixel antialiasing (SPAA) suddenly no longer works for you in Qt4, try the following:

$ echo Xft.rgba: rgb >> ~/.Xresources
$ xrdb -merge ~/.Xresources

Two days of debugging this have left me too pissed off to offer any explanation (just to note, radeon's Option "SubPixelOrder" doesn't work; if you feel like digging, see qfontengine_ft.cpp and libXrender's and libXrandr's sources). Also, YMMV. If it works, you're on your own to make sure it gets loaded properly.


Hunky Punk — crawl dungeon on the bus

How deep dare you go?

An interactive fiction interpreter for Android has been released! Look at the code or just grab the package on the Market.


HTC Dream's notification LED color

Android documentation says:

public int ledARGB

The color of the led. The hardware will do its best approximation.

Best approximation my ass. I wanted #fb2a0c, I got pink. To get the orange color, I needed #080800, for which orange is most certainly not what I'd call the best approximation. It seems that the Dream's drivers just feed the raw RGB values to the hardware without bothering to do any processing on them.

To make it easier to find out which color to feed your device to get the real best approximation of the color you want, I made this quick and dirty Android LED tester. Just slide the sliders until you're satisfied, note the values and get on with it.

As a note, you might want to serve the fake colors only to devices you tested them on, in case other devices handle this properly and your app would turn out to blink the LED in real #080800, which is to say, almost black. I do this with:

if (Build.DEVICE.equals("dream"))
    color = 0xff080800;
else
    color = 0xfffb2a0c;

Obviously, if you test on more devices and they're all equally quirky, you need to add more checks.
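Past one or two devices, the chain of ifs is better off as a small lookup table. A sketch (plain Java so it runs anywhere; only the dream value is from this post, the class name and any other entries are made up):

```java
import java.util.HashMap;
import java.util.Map;

// Maps a device name to the raw value its LED driver needs to approximate
// the color we actually want. Only the "dream" entry is from measurement;
// untested devices fall through with the real color unchanged.
public class LedColorTable {
    private static final Map<String, Integer> QUIRKS = new HashMap<>();
    static {
        // HTC Dream needs 0xff080800 to show orange (#fb2a0c)
        QUIRKS.put("dream", 0xff080800);
    }

    // device would normally be android.os.Build.DEVICE
    public static int colorFor(String device, int wanted) {
        Integer quirk = QUIRKS.get(device);
        return quirk != null ? quirk : wanted;
    }

    public static void main(String[] args) {
        System.out.printf("dream: %08x%n", colorFor("dream", 0xfffb2a0c));
        System.out.printf("other: %08x%n", colorFor("passion", 0xfffb2a0c));
    }
}
```

(Like the if/else above, this quirks the color per device regardless of which color was asked for; a real table would key on both.)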


Making Android application icons

Making application icons that fit the style of the original Android icons isn't easy at all. I've found it's easiest to model the icon in 3D and render it. In the best interest of aesthetically-sensitive Android users worldwide, I'm making available this Android application icon template you can use to make your beautiful new icon.

It's got the camera, lights and whatnot set up just the proper way. It's still a little bit off from the original icons (the shadow could be softer, for instance), but it works well enough for me after some time struggling with Blender's clunky interface. If you can fix it, great: the drop has guest upload enabled, so feel free to share your improved version.


Extracting 9-patches from apk files

As you may or may not know, 9.png files in compiled Android packages have the nine-patch metadata moved out of the image border into a binary chunk (npTc) inside the PNG file. The following quick and dirty Ruby script extracts it back.

#!/usr/bin/env ruby

# Rafał Rzepecki 
# public domain
# deserializes metadata of 9-patch png file
# optionally writes out png with 9-patch info embedded (needs imagemagick for that)
# quick and dirty hack, no error handling, almost no test, YMMV
# for format specs see android/platform/frameworks/base/libs/utils/ResourceTypes.cpp
# (in android platform source)

if ARGV.length == 0
    print "Usage: #{__FILE__} <serialized nine-patch png file> [optional output png with inline 9-patch info]\n"
    exit 1
end

filename = ARGV[0]
png = File.open(filename) { |f| f.read }
index = png.index 'npTc'
data = png[(index+4)..-1]
wasDeserialized, numXDivs, numYDivs, numColors = data[0...4].unpack('C4')
paddings = data[12...(12+16)].unpack('N4') # left right top bottom
# the div and color arrays follow the 32-byte serialized header
offset = 32
xDivs = data[offset, 4*numXDivs].unpack("N#{numXDivs}")
offset += 4*numXDivs
yDivs = data[offset, 4*numYDivs].unpack("N#{numYDivs}")
offset += 4*numYDivs
colors = data[offset, 4*numColors].unpack("N#{numColors}")

print "was deserialized: #{wasDeserialized}
paddings: #{paddings.join(', ')}
xdivs: #{xDivs.join(', ')}
ydivs: #{yDivs.join(', ')}
colors: #{colors.map{|c| "#%08x"%c}.join(', ')}
"

if ARGV.length == 1
    exit 0
end

# quick and dirty
`identify #{filename}` =~ /PNG (\d+)x(\d+)/
w, h = $1.to_i, $2.to_i
`convert #{filename} -bordercolor white -compose Copy -border 1x1 -stroke black \
-draw 'line #{xDivs[0] + 1},0 #{xDivs[1] + 1},0' \
-draw 'line 0,#{yDivs[0] + 1} 0,#{yDivs[1] + 1}' \
-draw 'line #{paddings[0] + 1},#{h + 1} #{w - paddings[1]},#{h+1}' \
-draw 'line #{w+1},#{paddings[2] + 1} #{w+1},#{h - paddings[3]}' \
#{ARGV[1]}`

SIM contact list query doesn't support filtering

content://sim/adn doesn't respect the where clause when you query it. Just so that someone else doesn't spend as much time as I did wondering why the freaking thing wouldn't work.

You have to filter it another way. I did it by creating a custom CursorAdapter to filter it, but your mileage may vary. It might be better to make a cursor wrapper, or something. Or is there something like that in the library already, and I just missed it?
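The client-side filtering itself is trivial once you give up on the where clause. A plain-Java sketch of the idea (the real thing would wrap a Cursor rather than a list of names; these class and method names are made up):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

// Since content://sim/adn ignores the where clause, fetch everything
// and filter on the client. Names stand in for the cursor's rows.
public class SimFilter {
    public static List<String> filterByName(List<String> names, String query) {
        String needle = query.toLowerCase(Locale.ROOT);
        List<String> out = new ArrayList<>();
        for (String name : names) {
            // same case-insensitive contains-match a CursorAdapter's
            // filter would typically do
            if (name.toLowerCase(Locale.ROOT).contains(needle)) {
                out.add(name);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> adn = List.of("Alice", "Bob", "Alicja");
        System.out.println(filterByName(adn, "ali")); // [Alice, Alicja]
    }
}
```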


Launching applications in Android

To launch an application programmatically in Android given an ApplicationInfo, do:

ApplicationInfo ai; // obtained earlier, e.g. from PackageManager.getInstalledApplications()
PackageManager pm = getPackageManager();
try {
    Intent i = pm.getLaunchIntentForPackage(ai.packageName);
    startActivity(i);
} catch (Exception e) {
    Toast t = Toast.makeText(this, "Couldn't launch the application.", Toast.LENGTH_SHORT);
    t.show();
}
Apparently this was harder pre-1.5, where you had to look for the proper activity yourself.

Blogger's data:blog.feedLinks

As you might have noticed, I got sick of the crufty, old, temporary-turned-permanent template here and have been playing with making a new one from scratch. So you might expect this place to look even worse and more half-baked for some time to come.

And while I was trying to get the RSS links into the header, I came across the data:blog.feedLinks thingy. I was extremely puzzled as to why I couldn't <b:loop> it... turns out it's not a list, but a piece of HTML code. Just smack <data:blog.feedLinks/> into your <head> and be done with it.

To end this rant, I'd really appreciate it if Blogger had anything resembling a proper reference instead of this freaking I'm-so-stupid-I-can't-use-a-reference-and-need-mom-to-answer-my-questions, freaking FAQd-up 'online help'.

EDIT: BTW, no, no comments for you. We'll see if it's to stay. Mostly depends on how fed up I'd be with the blogger API, I guess.


JFS external journal devices on LVM

JFS allows using external journal devices, but has trouble finding them when they're located on LVM (as of jfsutils-1.1.13). It turns out it doesn't search the proper dev directories when looking for a device by UUID.

It does search /dev/evms, though. A simple workaround is to ln -s mapper /dev/evms.


Power management in Android's kernel

Google was going to be an interesting case of a large company hiring people both from the embedded world and also the existing Linux development community and then producing an embedded device that was intended to compete with the very best existing platforms. I had high hopes that this combination of factors would result in the Linux community as a whole having a better idea what the constraints and requirements for high-quality power management in the embedded world were, rather than us ending up with another pile of vendor code sitting on an FTP site somewhere in Taiwan that implements its power management by passing tokenised dead mice through a wormhole.

To a certain extent, my hopes were fulfilled. We got a git server in California.



Restrained Life Viewer for Linux

The current (1.15.2_LL-1.21.6) binary distribution of Restrained Life Viewer for Linux is broken: the binary is compiled with an extra flag which is not in the settings file, which makes the client crash when trying to wear a blindfold.

Luckily this is easily fixed. The patch below lets you enjoy being blindfolded on Linux, too.

--- RLV-1.15.2_LL-1.21.6/app_settings/settings.xml      2008-10-25 22:03:09.000000000 +0200
+++ /home/divide/games/SecondLife-i686-   2009-03-11 13:58:29.000000000 +0100
@@ -12,6 +12,17 @@
+    <key>RestrainedLifeNoSetEnv</key>
+    <map>
+      <key>Comment</key>
+      <string>Toggles the RestrainedLife atmospheric effects restraint, needed for eg. blindfolds. (False means blindfolds work.) Needs a restart of the viewer.</string>
+      <key>Persist</key>
+      <integer>0</integer>
+      <key>Type</key>
+      <string>Boolean</string>
+      <key>Value</key>
+      <integer>0</integer>
+    </map>


The sound of silence

From "The Power of Now" by Eckhart Tolle:

Every sound is born out of silence, dies back into silence, and during its life span is surrounded by silence. Silence enables the sound to be. It is an intrinsic but unmanifested part of every sound, every musical note, every song, every word.

It somehow made me think of John Cage:


Probably the most beautiful short I've ever seen

(Via ^eirena.)

Snapshotting and cropping widget

I wrote a simple Qt widget that can take an image of its own contents and then crop it. (I use it with a video player widget to let the user take a snapshot of the video stream.) You can find it at the repo. It's not perfect, but it's what I need. Feel free to extend it; mob access is open.

As an example (no code at all, all designer point-and-click), consider the following window:

Now if you click the "Snap" button (which is connected to the CropWidget's snap(bool) slot), the controls are no longer interactive (and are, in fact, just images of controls). You can use your mouse instead to draw a crop rect:

And then, by clicking "Crop" (connected to crop(), simple), get it... you guessed it ;)


Synaesthesia resurrection

Synaesthesia may well be the most beautiful and meaningful music visualization I've ever seen.

Unfortunately the project seems abandoned: it's twelve years old and the last release was in 2005. As such it's a little bit hard to use nowadays; well, you can pipe sound in just fine, but what it really lacks is the ability to record a video of the visualization.

This is especially important for me because I'm using a Polish Twitter-like thingy which makes it much more straightforward to embed videos than sound; every time I find myself wanting to share a song I'm listening to I have to find a video on YouTube, and when there is none, I feel compelled to share a synaesthesia video.

So I took the matter into my own hands and hacked together raw video output for synaesthesia. You can find the code at the repo; feel free to tinker in it, you may even commit anonymously (just use the mob account and branch).

I can now do
mpg123 -s song.mp3 | \
  ./synaesthesia --width 640 --height 360 --output-raw 25 pipe 44100 | \
  nice -n10 mencoder - -demuxer rawvideo -rawvideo w=640:h=360:format=rgb32:fps=25 \
    -oac copy -audiofile song.mp3 -ovc lavc -o song.avi
and get a video like this one:

The only thing that still needs doing is making it possible to visualize asynchronously. Currently it generates the video in real time while playing, and this sometimes causes dropped frames (visible in the video as hiccups, especially at the beginning).