"kwa watu" means "for people". I'm interested in software tools that help people live healthy, purposeful, fully human lives, and in how technology shapes the consensus on what that means.
Monday, May 10, 2010
So, it turns out[1] that five cities have finally been selected to participate in the Code For America[2] program for 2011: Boston, Boulder, Philadelphia, Washington, D.C., and Seattle. Each city will identify a city-government project that can leverage modern web technologies, be assigned a team of five developers[3], and work with them to develop that project over 11 months, starting in January.
I am on pins and needles to see what comes out of this. I'm also curious how the city governments plan to pick the projects.
[1] "Five Cities Get Free Civic Apps Through Code for America". Digital Philadelphia. http://digitalphiladelphia.wordpress.com/2010/05/07/five-cities-get-free-civic-apps-through-code-for-america/
[2] "About". Code For America. http://codeforamerica.org/about/
[3] Developers apply for the opportunity. The application will be available June 1, and the deadline is August 1. "For Developers". Code For America. http://codeforamerica.org/for-developers/
"kwa watu" means "for people". I'm interested in software tools that help people live healthy, purposeful, fully human lives, and in how technology shapes the consensus on what that means.
Monday, May 10, 2010
Wednesday, March 17, 2010
Augmented (Hyper)Reality
Over on GOOD.is, there's a video that demonstrates Keiichi Matsuda's conception of what augmented reality might look like. As we've seen in some previous posts, this type of reality may not be as far off as some would like.
Augmented (hyper)Reality: Domestic Robocop from Keiichi Matsuda on Vimeo.
But what's with all the ads? Adds a touch of realism, I suppose.
What Augmented Reality Could Actually Look Like
http://www.good.is/post/what-augmented-reality-could-actually-look-like/
Wednesday, March 10, 2010
Recognizr: An Augmented ID Concept
A few posts ago[1], I referenced the SixthSense TED presentation. Here's[2] another technology along the same lines: a prototype video of an Android app that retrieves information about a person using facial recognition. They call it Recognizr, an "augmented ID" concept.
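I don't know anything about Recognizr's internals, but the general recipe behind this kind of augmented-ID lookup is roughly: detect a face, reduce it to a numeric feature vector (an "embedding"), and find the closest match in a database of known profiles. Here's a minimal sketch of just the matching step, with made-up names and toy data; in practice the embedding would come from an actual face-recognition model, not be typed in by hand.

import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(face_embedding, profiles, threshold=0.8):
    """Return the best-matching name, or None if nothing is close enough."""
    best_name, best_score = None, threshold
    for name, embedding in profiles.items():
        score = cosine_similarity(face_embedding, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy example with made-up three-dimensional "embeddings":
profiles = {
    "Alice": [0.9, 0.1, 0.3],
    "Bob":   [0.2, 0.8, 0.5],
}
print(identify([0.88, 0.15, 0.28], profiles))  # -> Alice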
Also, for those eye-tracking augmented reality glasses that I mentioned in the other post...the eye-tracker just seems like a slight modification of this[3].
[1] "SixthSense" from MIT Media Lab
http://kwawatu.blogspot.com/2010/02/sixthsense-from-mit-media-lab.html
[2] Recognizr
href="http://www.youtube.com/user/TATMobileUI#p/u/0/5GqJHaNRla
[3] Student learns to control computer with a blink of an eye
http://www.rit.edu/news/?v=46626
And make sure to check out the WSJ video:
Andy Jordan's Tech Diary: EyeTech Quick Glance
http://online.wsj.com/video/andy-jordans-tech-diary-eyetech-quick-glance/6B9D2F61-C8FE-41F8-BA10-4F2DFB85355D.html
Labels: augmented reality, human-computer interaction, video
Wednesday, March 3, 2010
Computers shouldn't make people feel like idiots
For those of us surrounded by the minutiae of computers all day, it’s easy to forget there’s a world of people out there who just don’t get it. And it’s not their fault. It’s ours.
Interesting article[1] over on the 37signals design and usability blog. Some meta-analysis regarding the iPad. I really like this quote from Fraser Speirs:
The Real Work is not formatting the margins, installing the printer driver, uploading the document, finishing the PowerPoint slides, running the software update or reinstalling the OS.
The Real Work is teaching the child, healing the patient, selling the house, logging the road defects, fixing the car at the roadside, capturing the table's order, designing the house and organising the party.
- Fraser Speirs, Future Shock [2]
[1] "Computers shouldn't make people feel like idiots". 37signals. http://37signals.com/svn/posts/2132-computers-shouldnt-make-people-feel-like-idiots
[2] "Future Shock". Fraser Speirs. http://speirs.org/blog/2010/1/29/future-shock.html
Wednesday, February 24, 2010
[Socially] Situating Personal Information Management
PIM practices become easier if [an] organization provides some infrastructure to alleviate the difficulty of these activities. But a larger value is that the organization can leverage these personal practices to improve the effectiveness of others and to capture that elusive corporate knowledge in an easy way.
Thought-provoking. Give the video a watch[1].
[1] "Situating Personal Information Management". Google Tech Talks. http://www.youtube.com/user/googletechtalks#p/u/1/eA9NT4b6UNA
Wednesday, February 17, 2010
OOPSLA is changing. OOPSLA is becoming SPLASH.
The conference for Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA) has a new name and overall mission. It's now Systems, Programming, Languages, and Applications: Software for Humanity (SPLASH). I'm liking the name.
Seems like they're also trying to account for the Onward! track, which had already outgrown OOPSLA's object-oriented focus. I approve.
In 2002, Onward! was created as a special track within OOPSLA to be a venue for bigger ideas than normally are accepted by mainstream computer science conferences, but within the scope of OOPSLA’s focus. "Bigger ideas" included new approaches to programming, software, and software development; new paradigms; and even new ways to present ideas.
Beginning in 2003, Onward! papers were included in the OOPSLA proceedings, and in 2005, Essays and films were added to Onward!. As the track grew, it became clear that there was a need for Onward! in a larger context than object-oriented programming, and in 2009, Onward! spun off from OOPSLA to become a stand-alone conference focusing more broadly on software and programming in all their manifestations, and including not just the pure technology but also processes, methods, and philosophy.
From 2010, we plan that Onward! will be co-located with SPLASH (the evolution of OOPSLA), but in the future, the sky’s the limit.
- Onward! History
For important dates and information, see:
http://splashcon.org/
http://onward-conference.org/
Wednesday, February 10, 2010
"SixthSense" from MIT Media Lab
Pattie Maes and one of her students, Pranav Mistry, demonstrated a system they've been working on to "augment" a user's experience of the world by delivering relevant information about certain objects, as well as allowing the user to interact with that information.
Pattie Maes and Pranav Mistry demo SixthSense
http://www.ted.com/talks/pattie_maes_demos_the_sixth_sense.html
I've imagined something similar in the form of glasses that record your eye movements and cross-reference that data with recorded images of what's in front of you to determine points of focus. Then, theoretically, they could display information about whatever you're focusing on onto the glass of the spectacles. Pattie Maes takes it in a slightly different direction when, at around 08:30 in the video, she says, "who knows, maybe in another 10 years we'll be here with the ultimate 6th sense brain implant."
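To make that idea a little more concrete, here's a rough sketch of the plumbing I have in mind: take a gaze sample from the eye tracker, map it onto the forward camera's frame, crop the region being looked at, and hand the crop to whatever recognition or lookup service you like. Every name here is hypothetical; nothing refers to a real eye-tracker API.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # normalized horizontal gaze position, 0.0 (left) to 1.0 (right)
    y: float  # normalized vertical gaze position, 0.0 (top) to 1.0 (bottom)

def focus_region(gaze, frame_width, frame_height, box_size=200):
    """Return a (left, top, right, bottom) crop box around the gaze point."""
    cx = int(gaze.x * frame_width)
    cy = int(gaze.y * frame_height)
    half = box_size // 2
    left = max(0, cx - half)
    top = max(0, cy - half)
    right = min(frame_width, cx + half)
    bottom = min(frame_height, cy + half)
    return left, top, right, bottom

def annotate(frame, gaze, recognize):
    """Crop the focused region and return (box, label) to draw on the lenses.

    `frame` is assumed to be a PIL.Image from the forward-facing camera, and
    `recognize` is any callable that maps an image crop to a text label
    (an object-recognition model, a web lookup, and so on).
    """
    box = focus_region(gaze, frame.width, frame.height)
    label = recognize(frame.crop(box))
    return box, label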
Regardless of the interface (fingers, eyes, brain, etc.), is this something that would be good for humans?