Monday, March 30, 2009

UI for AR?

Google Alerts are like the best little gnomes; they bring you shiny things, hoping that you like them. Things like "Multiuser interaction with handheld projectors", and the YouTube videos that spawned from it.



I commented - I like my gnomes that much:

Interesting. That’s perhaps the very first example I’ve seen of real-world tasks being performed in an Augmented Reality environment. I share your hesitance over the actual usefulness of projector technology, but I can easily imagine this kind of research being very useful when applied to a Heads Up Display environment - no privacy issues there. In fact it could be worse: if two people are wearing HUDs and ‘think’ they’re seeing the same view, one of them could have been hacked and manipulated into authorising something they wouldn’t otherwise.

I don't think anyone is thinking much about AR and security yet, unless it's the Defcon folks hacking Bluetooth.

moar on the Vuzix 920AVs

Some happily specific details shamelessly scrobbled from some guy who had dinner with the Vuzix and MetaIO folks at GDC09. He said some other things, something about fMRI-driven missile launch systems, I dunno.


* They don’t know exactly what the price will be, but they are expecting it to be less than $500.
* Paul is very confident that the Wrap glasses will ship this year.
* The displays are 800×600 in these glasses. That’s a step up from the 640×480 resolution that their other glasses use.
* The two displays are independently controllable through a variety of methods, but if your software can handle it, you can provide 60Hz to each eye.
* The IMU for the Wrap will include accelerometers, gyros, and magnetic sensors, and will provide yaw, pitch, and roll to the software at a very high rate.
* When they are in visual pass-through mode the Wraps will blend a translucent scene over the world. In this mode, the brighter a pixel is, the more visible it will be to the user. That makes black the transparent color and white the “visible as it gets” color. (I’ve sketched a rough desktop simulation of this blending just after the list.)
* Paul was coy about exactly what the specs on the camera will be. I think they aren’t 100% settled yet. He was very aware of the issues with frame rate on USB cameras, though, so hopefully they will figure out a way to provide a reasonable frame rate (or at least crisp frames).
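
Out of curiosity, you can roughly preview that blend mode on the desktop with ImageMagick: a 'screen' composite makes dark pixels vanish and bright pixels punch through, which sounds like what the Wrap's optics will do. Just a sketch - world.jpg (a webcam frame) and hud.png (an overlay drawn on a black background) are made-up filenames:

# rough simulation of the Wrap's black-is-transparent blending
# assumes ImageMagick is installed; world.jpg and hud.png are hypothetical inputs
convert world.jpg hud.png -compose Screen -composite wrap-preview.jpg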

So.. it sounds like they're getting better and more expensive all the time, which is OK with me. I guess they want to dominate the market for a while - my advice is to hurry up and get that first-mover coolness established. Perhaps they can be the Apple of AR.


Today I scared librarians by talking about RFID tags and security; waxed lyrical about AR goggles and made people jealous of my EeePC. It has been a good day.
Now it is a good day with beer in it. \m/
And I was lying about the missiles.

Friday, March 27, 2009

Sign language gestural interface

I just posted this in a comment on an article about sending sign language video to handheld devices.

How about a sign-language-to-text interface? It could use the familiar skills of the hearing-impaired person to read their gestures (with fingertip markers like MIT’s Sixth Sense, or gloves with accelerometers) and output the text to a display - on a T-shirt, perhaps. This would allow easy communication with ‘normal’ people who aren’t proficient in sign language.

They could just use a keyboard, I suppose, but this would use their existing skills and potentially offer a totally hands-free interface.

Addendum: I worked at a service station once - six hearing-impaired people rocked up wanting to buy some food, and the one guy who was moderately good at speaking interpreted for the rest of them. It was hard going (especially through the intercom/window). This idea could have been quite handy.

Thursday, March 26, 2009

moar Augmented Reality goggles

There's been some more movement of late in the Augmented Reality/Heads Up Display space.

Vuzix have partnered with AR company MetaIO to market an add-on for their VR920 video glasses (the older, Star Trek-style, non-transparent ones). The package features a USB camera that clips onto the front middle of the gogs, and a 3-axis accelerometer pointer/wand thingy for poking stuff in AR space. The camera assembly also packs some capacity to sense the user's position and orientation - there's no mention of precisely how it does this, though given that they're prepping another add-on for the as-yet unreleased (transparent) Wrap 920AV model that features accelerometers for head-tracking, I assume they're using the same hardware in the VR920 AR upgrade.
http://www.gizmotron.co.uk/2009/03/26/vuzix-augmented-reality-accessories-let-you-interact-with-virtual-worlds/



Carl Zeiss have come to the party with a shiny-looking pair of video glasses - the 'Cinemizer'. Technology-wise, these look to be basically on par with the un-upgraded Vuzix VR920s. They're non-transparent, with extendable earphone arms, and feature a virtual screen equivalent to a 45" TV viewed from 78" away. What that means in terms of resolution, they're not saying, though apparently it's really, really good. They'll be available in May for US$499. For that price, though, you're probably better off waiting a few months for Vuzix's Wrap 920AVs - they look better, they're transparent so you won't walk into walls, and they can actually do Augmented Reality.
http://www.zeiss.com/C125679B0029303C/ContainerTitel/Cinemizer_EN/$File/index.html



In other news, I found a grinder/hacker/whatever: b.zerk is making his own sensor gloves to go with his Vuzix VR920 + webcam combo, which he hacked up to run with his OS X laptop. Some cool photos on Flickr:
http://www.flickr.com/photos/33676318@N06/3297008280/in/photostream/

Saturday, March 21, 2009

Project Eyeborg is almost complete!

Rob Spence, the Canadian documentary filmmaker who lost his eye in a shooting accident when he was 13, is getting a tiny wireless camera fitted into his prosthetic. It's almost done - they have it working under battery power. The picture of Rob on the site is either incredibly creepy or incredibly awesome, depending on how much sci-fi you read as a child. You can guess which way I lean. :)

http://www.eyeborgproject.com/blog/2009/03/19/cyborgs-do-it-on-the-internet/

Monday, March 2, 2009

Face the mirror and say "Skittles" five times and he will come for you...

It seems there's something new and weird happening on Twitter every day now. On Saturday @mpesce bet @stilgherrian $10 he couldn't get 'fisting' into the top trending topics on twitscoop.com, and later onto the topics on search.twitter.com (2 different algorithms! challenging!). Well, yeah, he could - maybe it had something to do with 3 of @stilgherrian's followers having several thousand followers of their own. Anyway, it was fun to follow the #fisting hashtag and see all the people going "wtf? why are so many ppl talking about fisting?!"
And thus was born the term 'trendfisting', to denote gaming Twitter for lulz. It may even last.

On Sunday, Rove mentioned Twitter on his TV show as part of some blather over an article claiming that social networks will rot your brainmeats. So a thousand Australians signed up to Twitter and spoke the immortal words "watching rove".. one after another. It was like 'Invasion of the SassyBodysnatcher_69's'. The fisters were not amused - common folk in our geekish clubhouse, ew.

Today skittles.com is showing nothing but the search.twitter.com page, searching for 'skittles'. This may be a super-ironic and genius ploy to get the internets to maintain their website and marketing campaign for them, but I'm leaning more towards weird and slightly creepy, as in "We can see you.. every time you say 'Skittles' we will know...", like Candyman.

Sunday, March 1, 2009

Sunday hax0ring with Twitter

I was getting a little tired of just changing my userpic on Twitter automagically every couple of hours, so I decided to make one of my pics a composite image of all my 'friends' there - staring back at them, as it were. It turns out it's pretty easy to do with a shell script and their open API.

#!/bin/bash
echo "getting friends list:"
curl http://twitter.com/statuses/friends/cnawan.xml > cnawans-friendlist.xml

echo "trimming friends list:"
grep "/profile_image_url" cnawans-friendlist.xml >
cnawans-friends-pictures.txt
echo "more trimming"
sed -i 's/<profile_image_url>//g' cnawans-friends-pictures.txt
echo "yet more trimming:"
sed -i 's/<\/profile_image_url>//g' cnawans-friends-pictures.txt

echo "getting friends pictures:"
wget --input-file=cnawans-friends-pictures.txt

echo "averaging pictures:"
convert -average *.jpg cnawans-friends-pictures-averaged.jpg

..and set it as my userpic with a cron job like the others:

curl --header "Expect:" -F
'image=@/home/../cnawans-friends-pictures-averaged.jpg' -u
USERNAME:PASSWORD http://twitter.com/account/update_profile_image.xml
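
In case anyone wants to copy this: the cron entry just calls a wrapper script holding that curl line. The path and schedule here are made up, obviously:

# hypothetical crontab entry - update the userpic every couple of hours
0 */2 * * * /home/cnawan/bin/set-averaged-userpic.sh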


As it stands, it only uses the jpgs, not the few pngs that are on the list. I might adapt the script to do them all, but the jpgs tend to be photos more often and make the averaged image look more like a spooky face. :)
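
If I do get around to it, the lazy fix is probably a one-liner before the averaging step - convert the downloaded PNGs to JPGs so the wildcard picks them up (assuming ImageMagick's mogrify is installed; untested):

# untested sketch: make JPG copies of any PNG avatars before averaging
mogrify -format jpg *.png
convert -average *.jpg cnawans-friends-pictures-averaged.jpg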