Chances are, over the last 30 or so years, you’ve sat at a desk typing on a keyboard in front of a monitor, clicking around with a mouse to interface with technology. Oh sure, mobile devices have been around for a while, but only over the past five or so years has using them been even close to an enjoyable experience. Today we have multi-touch devices and gaming systems like the Nintendo Wii, but it’s what’s on the horizon that will change how we think about the user interface entirely.
There’s not a lot more I can say about BumpTop that isn’t already covered in the video, except that I have some hands-on experience. I have used it for about two weeks now on both my desktop Mac and PowerBook. The learning curve isn’t as gentle as they make it seem in the video, but it was easy enough to get used to in a week or so. My desktops (both real and virtual) are always a mess. I have Stacks and Spaces to help me get organized, but BumpTop is more natural, and it really is what Spaces and Stacks should have been.
BumpTop’s gestures (Pro version only) work with your laptop’s multi-touch trackpad, and also work with a Wacom Bamboo Touch, so desktop users aren’t left out.
Yes, you’re still sitting in front of a screen, but BumpTop really gives you a sense of reality. The physics are superb, and stacking documents into piles, either by random grouping or by type, helps keep you organized.
Other notable alternative computer desktops have been around for some time, and most of them look at the desktop in 3D from the outside.
They have the “fun factor”, but don’t have the natural usability of BumpTop. It is because of this that they will remain a novelty at best. Don’t get me wrong, they are all forward-thinking alternatives to the flat 2D experience of the last 30 or so years, but if we are truly going to change the game, even BumpTop needs to start looking further ahead.
I still vividly remember my father taking me and my older brother to go see Star Wars, and the scene where C-3PO is about to beat Chewie at a game of HoloChess. I wanted one of those so bad I could taste it! For the next, oh, 31 years I would be destined to compare every new technology to something I saw in Star Wars, and we’re finally seeing a lot of what was science fiction becoming science FACT! In the next couple of examples, we’ll take a look at what we can expect to start seeing.
This team from CMU developed a D&D game using Microsoft’s Surface that makes me feel like a kid again!
Do a search on YouTube (or your video site of choice) for Microsoft Surface and you’ll find a plethora of examples of how this technology is being used. From entertainment to retail, the Surface UI will most definitely change our lives.
Advanced Touch-Manipulated User Interface
In this video from TED in 2007, Jeff Han demonstrates another surface UI. Watching the video, you can see how this surface differs from Microsoft’s Surface: it has pressure sensitivity, it sits at an angle (which may be for the purpose of the demonstration only), and it also appears to be a transparent interface. At certain parts of the video you can see the underside of the surface and still see the UI, à la Avatar and Minority Report.
3D (2½D) Immersive Interfaces
Utilizing the theory of electrostatics, we have designed a low-cost human-computer interface device that has the ability to track the position of a user’s hand in three dimensions. Physical contact is not required and the user does not need to hold a controller or attach markers to their body. To control the device, the user simply waves their hand above it in the air.
Gaming consoles like the Nintendo Wii use infrared sensors to detect position and an accelerometer for speed and rotation, but they are tethered to a physical controller. Yes, it’s wireless, but you still have to hold something to control actions. Looking ahead, these limits need to be lifted.
Microsoft’s Project Natal
By now I’m sure most of you have heard about Microsoft’s answer to the Wii, Project Natal. It isn’t the true controller-free gaming/navigation that has my interest piqued, but the camera with facial recognition and the ecommerce, or as I like to call it, vCommerce (virtual) possibilities that have my brain sparking, but we’ll get more into that later.
At about two-thirds of the way through the video, there is a short segment where a woman has a personal shopper recommending a dress for her to wear. She virtually takes the dress off the rack, drops it on her virtual self and is able to turn from side to side, seeing how the dress will fit from any angle. How well this works remains to be seen, but my guess is that soon we will be shopping for clothes online, not in front of our computers, but in front of our televisions. Not with an awkward-looking 3D model, but by looking at ourselves via video in the comfort of our own homes. FINALLY!! I can try on women’s clothes without getting chased out of the store! (did I say that out loud?)
So far, all the interfaces we’ve looked at have at least one common thread: all of them require us to be where they are located, and we’re confined to that space until we decide to stop using them. Which brings me to the last of our alternative interfaces.
Augmented Reality is here and it has the most potential for becoming an everyday part of our lives. We will wonder how we ever lived without it.
Mobile devices will be the driving factor behind this new technology. Their ability to free us from the tether of having to be where the user interface is sets them apart from all the rest.
Apps like Acrossair’s “Nearest Tube” help us in our everyday lives by using something we always have with us: our phones. There are already a few apps using augmented reality, and we’re sure to see more and more.
Consider this possibility.
You’re in (major metropolitan area) and you’re looking for the nearest (big chain department store). You launch the augmented reality app on your phone, enter your search term and hold up the phone. Using the phone’s camera, just like Nearest Tube, you get an image overlaid on top of your camera view with arrows pointing you in the right direction, along with the distance to said location.
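Under the hood, an overlay like this only needs the phone’s GPS fix, the destination’s coordinates, and the compass heading. A minimal sketch of the distance-and-bearing math (the function name and coordinates here are illustrative, not from any real app):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (miles) and initial bearing (degrees)
    from the phone's location to the destination."""
    R = 3958.8  # Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)

    # Haversine formula for the distance shown in the overlay
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    distance = 2 * R * math.asin(math.sqrt(a))

    # Initial bearing, 0 degrees = north, measured clockwise;
    # compared against the compass heading to rotate the arrow
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return distance, bearing
```

The app would subtract the phone’s compass heading from the bearing to decide which way the on-screen arrow points.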
In your preferences you have already indicated that if the distance exceeds 2 miles, the app should search for local taxi services with a 3-star rating or higher. You are, in fact, 4 miles from your destination and there are no taxis in sight, so you tell the app to send your location to Acme Cab Co., and a confirmation comes back saying a cab will be there to pick you up in 5 minutes.
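The preference described above is really just a threshold rule. A hypothetical sketch (the service names and ratings are made up for illustration):

```python
def should_call_taxi(distance_miles, taxis, max_walk_miles=2.0, min_rating=3):
    """Apply the user's preference: if the destination is farther than
    max_walk_miles, pick the best-rated taxi service meeting min_rating."""
    if distance_miles <= max_walk_miles:
        return None  # close enough to walk, no taxi needed
    eligible = [t for t in taxis if t["rating"] >= min_rating]
    if not eligible:
        return None  # no service meets the user's rating preference
    return max(eligible, key=lambda t: t["rating"])

# e.g. 4 miles away, two nearby services in the app's directory
taxis = [{"name": "Acme Cab Co.", "rating": 4},
         {"name": "Budget Cabs", "rating": 2}]
pick = should_call_taxi(4.0, taxis)  # selects Acme Cab Co.
```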
You arrive at your department store. Browsing around the new Spring fashions, you see an end-cap display with a recognizable symbol. Open your augmented reality app again, point it at the symbol, and a video starts playing. In this video, a virtual stylist tells you how to accessorize the garment you’re looking at, or tells you more about the product’s designer. Or perhaps it’s Beyoncé’s new Dereon line and you get to watch the new Beyoncé video.
Let’s take it one step further, using the Microsoft Natal garment idea. You have already created your virtual model, and this time you point the camera at the barcode on the garment. Now you’re looking at your virtual self wearing the garment. Moving the model around from side to side, you realize it must have just been the color, because it looks horrible on.
From desktop alternatives to location-based augmented reality, these are the types of user interfaces we have now and can look forward to. The technology is already here. It’s up to us to think of new ways to use it.
What are your favorites? Did we miss any that are even better? Let us know!