Balding for Dollars Update

balding_after.jpg

Well, my Balding for Dollars event was a success. I have raised $717.37 for Children’s Hospital (there is still time to help bring it to my goal of $1000 if you’d like to donate). Thanks to everyone who helped with donations, words of support, and especially to Frank for his photography, to Berenice for helping with the cutting, to my wife Daniela for her encouragement and help, and to our kids for their patience. I couldn’t have done it without all of you.

There has been one change of plans. I had wanted to donate the hair to a Canadian foundation making wigs for kids with cancer, but couldn’t find it. Daniela’s google-fu skills were better and tomorrow my hair will go off to the Eva and Co. hair donation program. And to clarify, my daughter didn’t shave her head, we’re sending a ponytail from when she went from a long hairstyle to a short one.

Thanks also to everyone who offered me a toque, but I’m well covered.

Imaginary Gadget: Tricorder

picture-1.png

This is a response to Bruce Sterling’s call for imaginary gadgets.

The Star Trek universe actually had several devices called Tricorders, and I’m not enough of a Trekkie to know what the difference was between them, or even what the “Tri” part was about. What I do know is that they were magical devices you waved knowledgeably in the air in front of you, and which then answered specific questions about your environment. Sometimes you had to fiddle with them a bit.

Back on Earth, Natalie Jeremijenko’s Feral Robotic Dogs project embeds cheap environmental sensors in off-the-shelf toys and sends them out into the environment. The variety of these sensors is growing; you can easily (and cheaply) buy sensors for carbon monoxide, radiation, volatile organic compounds (VOCs), etc. My imaginary gadget would simply replace the cheap toy dogs with yuppies and technophiles, by making a small device filled with sensors that plugs into their iPhone (or by building it directly into Google’s Android platform, or the XO2).

If anyone is interested in making this gadget less imaginary, please get in touch with me. Despite having a Master’s degree in Electrical and Computer Engineering, my practical electronics skills are currently at the level of making LEDs shift colors with my new Arduino. I’m willing to write the web service for co-ordinating all the sensor data, the iPhone application, and to expose all the aggregate data for nice visualizations in Processing, NodeBox, etc.
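For what it’s worth, the aggregation side of that web service could start very small. Here is a minimal sketch, assuming a hypothetical reading format — the sensor, value, lat, and lon fields and the average_by_sensor function are my inventions for illustration, not any real API:

```python
from collections import defaultdict

def average_by_sensor(readings):
    """Average the reported values, grouped by sensor type."""
    totals = defaultdict(lambda: [0.0, 0])  # sensor -> [sum, count]
    for r in readings:
        totals[r['sensor']][0] += r['value']
        totals[r['sensor']][1] += 1
    return dict((s, t[0] / t[1]) for s, t in totals.items())

# Hypothetical readings phoned in from the sensor gadgets
readings = [
    {'sensor': 'co', 'value': 1.0, 'lat': 49.26, 'lon': -123.10},
    {'sensor': 'co', 'value': 3.0, 'lat': 49.27, 'lon': -123.12},
    {'sensor': 'voc', 'value': 5.0, 'lat': 49.26, 'lon': -123.10},
]
print(average_by_sensor(readings))  # {'co': 2.0, 'voc': 5.0}
```

A real service would obviously aggregate by location and time as well, but the same grouping idea applies.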

Making the invisible visible in our environment can be the first step towards making radical improvements to that environment. Let’s see what’s out there, or in Star Trek parlance, “Make it so!”

Tab Dumping with AppleScript and back to Python


Goal: Iterate through all my (OS X Safari) browser windows and make a list of titles and urls which is then placed in the clipboard ready to be pasted into an email or blog post.

This is an update to Tab Dumping in Safari. That still works well as the basis for extending any Cocoa-based application at runtime, but it relies on SIMBL, which, while a great bit of code, essentially abuses the InputManager interface. Some developers and users shun such hacks, and at least one Apple application checks for them at startup and warns you against using them.

I have been running the WebKit nightlies, which are like Safari, but with newer code and features (most importantly to me right now, a Firebug-like developer toolkit). WebKit warns at startup that if you’re running extensions (such as SIMBL plugins) it may make the application less stable. I was running both Saft and my own tab dumping plugin, and WebKit was crashing a lot. So I removed those and the crashes went away. I miss a handful of the Saft extensions (but not having to update it for every Safari point release), and I found I really miss my little tab-dumping tool.

I toyed with the idea of rewriting it as a service, which would then be available from the services menu, but couldn’t figure out how to access the application’s windows and tabs from the service. So I tried looking at Safari’s scriptable dictionary, using the AppleScript Script Editor. Long ago, John Gruber had written about the frustration with Safari’s tabs not being scriptable, but a glance at the scripting dictionary showed me this was no longer the case (and probably hasn’t been for years, I haven’t kept track).

I am a complete n00b at AppleScript. I find the attempt at English-like syntax just confuses (and irritates) me no end. But what I wanted looked achievable with it, so I armed myself with some examples from Google searches and Apple’s intro pages, and managed to get what I wanted working. It may not be the best possible solution (in fact I suspect the string concatenation may be one of the most pessimal methods), but it Works For Me™.

In Script Editor, paste in the following:

set url_list to ""
-- change WebKit to Safari if you are not running nightlies
tell application "WebKit"
  set window_list to windows
  repeat with w in window_list
    try
      set tab_list to tabs of w
      repeat with t in tab_list
        set url_list to url_list & name of t & "\n"
        set url_list to url_list & URL of t & "\n\n"
      end repeat
    on error
      -- not all windows have tabs
    end try
  end repeat
  set the clipboard to url_list
end tell

I had to use AppleScript Utility to add the script menu to my menu bar. From there it was easy to create script folders that are specific to both WebKit and Safari and save a copy of the script (with the appropriate substitution, see comment in script) into each folder. Now I can copy the title and URL of all my open tabs onto the clipboard easily again, without any InputManager hacks.

I had some recollection that there is a way to do this from Python, so I looked and found Appscript. I was able to install this with a simple easy_install appscript and quickly ported most of the AppleScript to Python. The only stumbling block was that I couldn’t find a way to access the clipboard with appscript, and I didn’t want to have to pull in the PyObjC framework just to write to the clipboard. So I used subprocess to call the command-line pbcopy utility.

#!/usr/local/bin/python
import subprocess

from appscript import app

tab_summaries = []
# change 'WebKit' to 'Safari' if you are not running nightlies
for window in app('WebKit').windows.get():
    try:
        for tab in window.tabs.get():
            name = tab.name.get().encode('utf-8')
            url = tab.URL.get().encode('utf-8')
            tab_summaries.append('%s\n%s' % (name, url))
    except Exception:
        # not all windows have tabs
        pass
# communicate() writes the text and closes stdin, so pbcopy
# sees EOF and commits the text to the clipboard
clipboard = subprocess.Popen('pbcopy', stdin=subprocess.PIPE)
clipboard.communicate('\n\n'.join(tab_summaries))
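The text assembly itself is plain Python and can be checked in isolation. A minimal sketch with hard-coded tab data — the format_tabs helper is hypothetical, just factoring out the join:

```python
def format_tabs(tabs):
    """Join (title, url) pairs the way the tab-dumping script does:
    title and URL on adjacent lines, with a blank line between tabs."""
    return '\n\n'.join('%s\n%s' % (name, url) for name, url in tabs)

tabs = [('Example Domain', 'http://example.com/'),
        ('WebKit Nightly Builds', 'http://nightly.webkit.org/')]
print(format_tabs(tabs))
```

Pasting the result into an email or blog post gives each tab as a title line followed by its URL, with blank lines between tabs.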

The remaining hurdle was simply to put the Python script I’d written into the same Scripting folder as my AppleScript version. For me this was ~/Library/Scripts/Applications/WebKit/. When run from the scripts folder, your usual environment is not inherited, so the #! line must point to the version of Python you are using (and which has Appscript installed). You should also make the script executable. Adding .py or any other extension is not necessary.

Overall, while I found AppleScript to be very powerful, and not quite as painful as I remembered, I found the Python version (warts and all) to be easier to work with. Combined with the fact that the script folder will run non-Applescript scripts, this opens up new worlds for me. I have hesitated in the past to write a lot of SIMBL-based plugins, tempting though it may be, because they are hacks, and they run in every Cocoa-based application. But adding application-specific (or universal) scripts, in Python, is pure, unadulterated goodness.

Boffers

Boffers are foam swords for safer swordfighting. Here’s how to build boffers easily.

Make Simple boffers

And here is my first link on the Make blog, also on boffers.

Boffers on Make (my first entry in the Make blog!)

Remember to follow the rules and take care of your sword.

Why computers need so badly to be fixed

The computer has been around for about 50-60 years now. For the past 30 years or so people have been making big predictions about the advance of computing power–computers will soon outstrip human intelligence and go on to form their own digital utopia, etc., etc.

So why can’t my computer remember what I was working on last? Why does it take so long to start? Why, in the 21st century, is it so damn hard to learn to use a new program (or remember how to use one I don’t use often)?

Because computer development stopped in the 1970s, when computers became available commercially. They were extremely primitive devices, loved only by the gadget-obsessed, but for the first time a machine was available which could “learn”–that is, it could be programmed to do things beyond what it could do when it was built. The metaphors of human intelligence were intoxicating. But they were, and are, only metaphors.

So computers began to sell (and be marketed). Slowly at first, then with increasing enthusiasm. Computers weren’t ready for the average person (hell, they’re not ready for the average Electrical Engineer), but they sold well. Because they sold well they got cheaper and because they got cheaper (and were intoxicating), they sold even better. So there was no enthusiasm to improve the computer, besides incrementally making it faster, giving it more bells and whistles (thereby increasing its complexity and artificially driving up its price).

Then came the dotcom revolution, when the world discovered the internet. “Internet Time” became an excuse for every possible poor design decision. Suddenly we went from little or no progress, to moving rapidly backwards in terms of usability and design. Every web site has to go through the evolution of the computer until it settles into the inadequacies of early 80’s GUI design. Or <deity> forbid, Flash (but let’s not go there).

Computers are victims of their own success, and we are all victims of the computer. Now we’ve settled into the routine of paying thousands of dollars to test hardware and software for a living–no matter what field we work in. We struggle with our configurations, registries, manuals, and unnecessary, bloated software to try to get our work done, if we can still remember what it is by that time.

And it is only getting worse. Operating systems are now commodities. Operating systems, for crying out loud! How many people really know what the hell the “operating system” is or does? But we all know about Windows, and the Mac, and (increasingly) Linux. Since when should the operating system matter, anyway? It’s just there to give the programs a little boost, help them talk to the underlying hardware. If anything should matter it should be which programs will help you work on your data the best — it’s your data (document, spreadsheet, pictures, whatever) that’s important after all. A really good “operating system” would be one you never see, but which does plenty of work on your behalf — like making damn sure you never lose one jot of your precious work (or time). Instead we have to go through endless gyrations just to keep “operating systems” from screwing up too badly.

I saw a .sig recently which was amusing, “A computer without Windows is like a dog without bricks strapped to its head.” Funny, true, and (unfortunately) you can replace “Windows” with pretty much any operating system you choose to name. Windows is simply the most egregious example, but the best you can really say about <your favourite operating system> is that it’s the best of a bad lot.

What gives me some shred of hope is the open source movement. Not that open source software tends to be more usable, far from it. Linux, Mozilla, and Apache are pigdogs in the usability department. But the original work to create, develop, and evolve computers was done in the public domain, with public funds. It was cannibalized by corporate interests, who boldly stole work done with taxpayers’ money, broke it, and sold it back to them with a shrinkwrap license.

And then we have corporate executives who complain about the state of what they stole. “The internet isn’t dependable enough for business.” Well, so sorry, it wasn’t made for YOU. “Open source is un-American.” Um, no, taking money from the defence department to build stuff is pretty damn American (although it *is* unusual to build stuff with defence department money which is useful, or works). Unfortunately, stealing from the public commons seems to be pretty American too–just look at logging companies, mining companies, the California energy crisis, or Microsoft.

Oh, back to that shred of hope. Open source may generate a lot of crap, but it’s OUR crap. Anyone can learn from it. It opens the doors to learning and building new things, some of which might not suck. By recreating the public commons we give real innovation a chance. And after all, the people paid for the development of these machines that now rule our lives; isn’t it time to pay them back by making the machines useful?

[Originally published May 13, 2001 on my long-defunct Manilasites blog]
