Google Glasses: Complete Information
Google got a lot of people thinking when it unveiled Project Glass and its promo video showing how Google glasses of the future could work. It was cool. Plain and simple.
Of course, as is so often the case, things aren’t really that simple. It was a concept video, and few truly know what Google’s glasses in their current state can really do, or even how close Google really is to the currently fictional reality portrayed in the video. Experts in the field of augmented reality have expressed a fair amount of doubt, though there is still plenty of excitement coming from them as well.
AR technology concept broker/analyst Marianne Lindsell tells WebProNews, “Like the ‘Stark HUD’ concept (produced as far I can see as a sort of teaser for the Iron Man II film) I do suspect that the Google Project Glass video has a strong ‘Hollywood’ element.”
“I would guess that Google are both testing the market and managing expectation,” she says. “However also like Stark HUD, there has clearly been some use of technology in the production – and in the case of Project Glass, the tech/Hollywood ratio is I suspect much higher, if less than 100%.”
“The least realistic parts of the Google Glass video clip in my opinion are the field of view (a large FOV is needed – but can such a small device provide it?), the brightness (possibly – but there are some good techs out there), instant responsiveness, and to some extent the (presumption of?) excellent registration (which many AR concepts depend on, but Google have cleverly side-stepped in the clip by largely avoiding such concepts).”
“Focusing at an appropriate distance is possible (I have seen it), – but not in such a tiny piece of hardware (yet!),” she adds. “Even good registration is possible in some situations, – but any specs will be at the mercy of smartphone-like inertial and magnetic sensors (compasses are notorious), unless it can take its cues from the surroundings by image recognition and analysis (which some techs already do surprisingly well).”
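Lindsell’s point about being “at the mercy of smartphone-like inertial and magnetic sensors” is the classic registration problem: gyroscopes are smooth but drift over time, while magnetometers are drift-free but noisy. A common mitigation (illustrative only; nothing here is confirmed to be what Google uses) is a complementary filter that blends the two. The parameter names and values below are made up for the sketch:

```python
def complementary_heading(prev_heading_deg: float, gyro_rate_dps: float,
                          mag_heading_deg: float, dt_s: float,
                          alpha: float = 0.98) -> float:
    """One update step of a complementary filter for head-yaw tracking.

    The gyro is smooth but drifts; the magnetometer is noisy but drift-free.
    Blending the two is a standard way to stabilise heading for AR registration.
    """
    # Short-term: integrate the gyro rate to predict the new heading.
    predicted = (prev_heading_deg + gyro_rate_dps * dt_s) % 360.0
    # Long-term: nudge toward the magnetometer along the shortest arc,
    # handling the 359 -> 0 degree wrap-around.
    error = ((mag_heading_deg - predicted + 180.0) % 360.0) - 180.0
    return (predicted + (1.0 - alpha) * error) % 360.0

# Example step: heading 90 deg, turning at 10 deg/s for 0.1 s, compass reads 95 deg.
print(complementary_heading(90.0, 10.0, 95.0, 0.1))  # ~91.08
```

Image-based registration, which Lindsell notes “some techs already do surprisingly well,” sidesteps the compass entirely by taking cues from the surroundings.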
Some Things To Consider
Lindsell highlighted some very interesting points about the Google Glasses in a comment she left on another WebProNews article. I’ll repost that comment here:
The glasses -concept- is definitely possible (as is the head tracking). I have seen a number of products that convince me of that – but the sleek designer package probably isn’t (yet).
There are usability thresholds in many areas that such a product will need to meet to be truly useful:
1) Field of View – the Google Glass product seems way too small to provide a useable FOV (no-one is yet aiming high enough here)
2) Brightness – a huge dynamic range is needed – think about readability on a sunny day – and brightness takes power
3) Exit pupil – an optical engineering parameter that needs to rate highly or the slightest jiggle of the glasses on your face will rob you of the display
4) Focus – optics will be needed to focus the display at a useable distance
5) Transparency – too opaque and the readouts block out your view (mind that lamp post!) – too transparent and you can’t make out what the marker is saying
6) Zonation and format – you probably -never- want any readouts to appear within your central view area – designing them to appear in the optimum place on the periphery is vital. No large windows please! – prefer conformal indications and markers.
7) Probably more important than all of the above will be the off/standby switch – the default position should be standby – with a quick and easy way to switch ‘on’ while required
8) Responsiveness and Registration – such a device will be -very- sensitive to delays. A note for OS suppliers!
9) Driving – special case – needs an even more safety-oriented (and accredited) design – but by no means impossible – think HUDs in fast jets
When someone, let’s assume Google for now, first clears all of the above hurdles, then we may have a useable product, although you may not be as keen on it when you see how big the packaging is.
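To get a feel for the field-of-view threshold in point 1, simple flat-panel geometry gives a rough intuition (real near-eye optics collimate the image, so what matters is the virtual image, but the angular relationship is the same). The dimensions below are assumptions for illustration, not measurements of any Google hardware:

```python
import math

def angular_fov_deg(display_width_m: float, eye_distance_m: float) -> float:
    """Horizontal field of view (degrees) subtended by a flat display of
    the given width viewed from the given eye distance."""
    return math.degrees(2 * math.atan(display_width_m / (2 * eye_distance_m)))

# A hypothetical 10 mm-wide eyepiece element sitting ~20 mm from the eye:
print(round(angular_fov_deg(0.010, 0.020), 1))  # ~28.1 degrees
```

Roughly 28 degrees is modest next to the ~200-degree horizontal span of human vision, which is why such a small package makes the wide-FOV threshold hard to meet.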
I’m not quick to believe that Google’s sleek, small package is possible. Even then, I am assuming that the device will need to be connected to your smartphone.
Of course it’s always possible that the Google device uses a laser to project the display onto one of the eyepieces. That -might- allow a smaller packaging.
The concept of course remains valid, and the gauntlet is well and truly thrown down to all major players, to overcome the challenges.
As for all the different things such a product would be useful for, – I submit that we have only scratched the surface of AR as a whole.
Who would have imagined the WWW when first connecting two computers together (with due credit to Mr Berners-Lee).
AR is a whole new way of teaming technology with people. For that, the technology needs to be -really- people-friendly!
“Many of these parameters will have a threshold level the tech must achieve in order to be useable and acceptable to the consumer market,” Lindsell told us in an email. “I am not about to nail my colours to the mast on exactly where to call these levels, but suffice to say that whilst many products out there have some way to go, some of them are, as far as I can see, showing signs that they may get there. This is why I think there may be some real tech behind the Google Glass Project. What we don’t know of course is how far along Google are yet. I think the clue is that it is far enough for them to test the market and attempt to manage expectation.”
So What About Those Contact Lenses?
We recently looked at a fascinating presentation one of the Project Glass engineers gave at Google’s Solve For X event, in which he spoke about using technology similar to Project Glass’ in contact lenses (since then, Google has hinted at the technology for prescription glasses as well). We asked Lindsell if the contact lens approach would eliminate any of the doubt some experts have expressed about the technology Google is using.
“Probably not,” she says. “Of course there are a few universities (and even Microsoft) actively researching electronic display contact lenses, but it is still early days yet. There are significant hurdles in terms of how to power them, and even greater ones in terms of how to focus the image at a suitable distance.”
“Producing a picture matrix with sufficient resolution, over a sufficiently wide FOV is also a major challenge, and although I can’t speak for ‘hidden’ projects, I am not aware that we are even within sight of the right ball park yet (apologies for mixed metaphors),” she continues. “But then there again – electronic focus is possible (I have seen it) – though not in a miniature package. Contact lenses –may- seem like they would help with the FOV and form factor problems, but in reality I think they would have to solve those problems, in miniature, first. I think the jury is out on when contact lenses may be able to deliver AR (though I’m thinking 10 years+), although I might predict that in the interim electronic (non-AR) contact lenses may find use as a health sensor.”
We may not know how much of what has been presented in Project Glass is really feasible at this point, but Google’s promo video has clearly generated a lot of enthusiasm. We asked Lindsell if she expects a lot of excitement and involvement from developers as a result.
“I think this is where Google have really scored,” she says. “People sit up and listen when Google speak. It is my firm hope that they will be able to market an attractive product before this interest dies down. And here’s the rub – truly useable AR specs will require –a lot- of engineering, and this needs funding, which means market interest. There’s a chicken and egg situation here – the market is only interested in what is realistically possible (hence your own interest I suspect), – but even organisations with the ability to fund development need to prove there is a strong demand to release those funds, as well as a sense that the end product is truly feasible.”
“There may be some hope,” she adds. “I have seen demonstrations of many existing AR specs technologies first hand (including Vuzix, Laster, Trivisio, BAe Systems and a few others) and although I have yet to see a single system meet what I might call a people-friendly acceptability factor, I have seen the current state of development of some of the component technologies.”
“This why I think that AR specs will be possible,” she says. “What I am far less sure about, is the final form factor – but even here let’s not rush to judgement, as prototype devices are certain to be clunky and unpalatable, whereas there has been significant R&D and the final package may be acceptable (even if not quite as tiny as Project Glass). How far Google have really got with this, is anyone’s guess, but if they don’t have something up their sleeve, it would have been very brave of them to put about the Project Glass video clip, with such a tiny device – especially for Sergey Brin to be seen wearing them so openly.”
We wrote about that here, showing some photos photographer Thomas Hawk was able to get:
“If there is a secret here my guess would be laser projection (onto the eyepiece lens, not the retina – which would require eye tracking!) or possibly a cunning use of novel LED tech (there continues to be much R&D here – think blue LEDs and Shuji Nakamura – there was a wonderful Sci Am article about it a couple of years back),” she says. “By the way – that was the one big elephant in the room I forgot to mention in my earlier list – style. Obviously crucial to the market, and for that reason I would take the Oakley announcement very seriously, although I suspect they would do much better to team up.”
“So yes, I think Google have created a lot of interest – and I just hope they can maintain it long enough to release product,” says Lindsell. “Does Apple have something in the works? My guess would be yes – but it would be ultra hush hush, and I doubt if they will declare it until they are ready, in spite of Google’s announcement. Will they be working harder now in the background, – very probably yes.”
There are rumors going around, in fact, that Apple may be working with Valve on such technology.
It may be Google that has generated this wave of excitement related to the possibilities of augmented reality, but there are plenty of others working in the space, and it’s entirely possible that we’ll see even more interesting products coming from elsewhere.
“I see many AR technologies emerging,” Lindsell tells us. “From location-based to marker-based services, image recognition and interpretation, object tracking (now in 3D – see metaio), facial recognition (not just face tracking), zoning, fencing, pre and post-visualisation/transformation, on-the-spot translation, sophisticated auditory cues and environments, use of haptics (early days here – much potential), sensory substitution, crowd sourcing in near real time, and even the use of VR in registration with sensor media to provide context. And there are so many ideas that people have yet to have – so much potential in AR yet to be realised. But there are key enabler technologies required first.”
“One of these is the AR specs,” she continues. “I think we are barely scratching the surface of how we might use AR. I really think that AR is the business end of a generational process of taking IT out of the office and conforming it to the user as ‘wearable tech’ that is constantly available to the user.”
“Think of everything that IT enables us to do now,” Lindsell concludes. “Computing was originally seen as wartime code-breaker technology. The cold war space race then helped it come of age (think chicken and egg again) because we needed help with the complexities of pre-launch checks for the hugely complex moon-rockets. Ever since there has been a march towards ? (no-one knows quite what!). All we know is that we use IT as an extension of ourselves – almost like add-on modules to help our brains (and occasionally other parts of us). So the real question is one of human and cultural evolution, what would we like some help with, and how can we increase our reach to get it?”
The technology, although admittedly cool, needs a lot of work, said Todd Haselton, senior editor of mobile for TechnoBuffalo.com. Despite rumors that the glasses will be available by the end of the year and a recent photo of Google co-founder Sergey Brin sporting a pair on PC World's blog, Haselton doesn't think the product is anything as advanced as what Google's video portrays – not yet, anyway.
"Augmented reality still doesn't work well on phones, which are more powerful than those glasses," he said. "I think the technology Google showed in its video is incredibly compelling, but we can't assume it's even close to that level of perfection yet."
But Google Glasses seem to run on the same technology we've been using for years, said Oskari Grönroos, CTO at Quotiply, a mobile phone app development company.
"I think the basic tech inside Google Glasses isn't that pie-in-the-sky," said Grönroos, who after watching the video said the platform looks to be basically made up of a transparent screen, a tiny computer with a specially designed OS, voice recognition, 3G/WiFi connectivity, a tiny camera, possibly eye-movement tracking and a long-lasting battery.
"All of these things we have had for years. Most of them are in your run-of-the-mill, low-end Android smartphone."
The main issues slowing Google down, said Grönroos, are getting the technology into a package that is affordable — a variety of bloggers speculate that the glasses could cost anywhere between $200 and $600 — and making them lightweight enough for consumers to wear.
"Obviously, it would also suffer from the same problems that smartphones do — data connection being too slow, GPS location isn't accurate enough. However, on a basic level, Google Glasses is just current technology in futuristic packaging."
Should retailers care?
Retailers should always be looking over the horizon to determine better ways to engage and interact with their customers, said Brian Ardinger, SVP & CMO of Nanonation, a company known for creating high-tech solutions to help retailers enhance the customer experience.
"While Google Glasses may or may not live up to the vision or hype seen in its promo video, the fact is customer technologies aren't going away. We are in the very early stages of disruption that today's mobile and social tools will have on the retail landscape," he said. "Customers now have the tools and technology to dictate engagement. To survive, retailers will need to learn, experiment and deliver compelling experiences across platforms and technologies."
Google Glasses could facilitate retail experiences in a similar fashion to the way smartphones are currently being used. They have the potential to help retailers make sales by popping up in-store offers, but they could also serve as a way to drive traffic, said Gary Edwards, chief customer officer of Empathica, a customer experience management (CEM) solutions provider. For example, the GPS in them could be used for wayfinding or to alert shoppers to offers when they are near certain stores. They're just another vehicle for location-based marketing.
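The proximity alerts Edwards describes boil down to geofencing: comparing the wearer’s GPS position against store locations. A minimal sketch of that check follows; the store coordinates and the 100-metre radius are invented for illustration, not drawn from any actual product:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0  # mean Earth radius, metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def near_store(user: tuple, store: tuple, radius_m: float = 100.0) -> bool:
    """True if the user is within radius_m of the store (a simple geofence)."""
    return haversine_m(*user, *store) <= radius_m

# Hypothetical shopper and store coordinates:
print(near_store((51.5010, -0.1240), (51.5012, -0.1242)))  # within ~100 m
```

As Grönroos notes elsewhere in the piece, GPS accuracy would be the limiting factor for how tight such a fence could usefully be.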
The glasses could also facilitate a real-time sale, where customers don't even need to be in the store to purchase something, Edwards said.
"This is just an extension of the current transition to couch commerce; (consumers) are using their tablets or phones to shop while watching TV, so now you're at home with your glasses on and can see augmented content on a TV show. It's even easier than with your phone; the interaction with TV would be a killer."
Edwards also believes the glasses could help retailers with internal communication. For example, managers performing check-ins could chat with employees while they both view a virtual checklist instead of doing live audits or sending coaches out to stores.
"You could experience the audit, compare, provide feedback and feature and record with live video streaming," he said. "This takes us a long ways away from clipboards."
On the contrary, Bob Phibbs, the Retail Doctor, doesn't see much of a use for them in the retail space.
"Yes they could be fun, but who exactly would this appeal to? And when marketers get a hold of them they will quickly be discarded like yesterday's 3-D glasses," he said.
Cost of Google Glasses
At Wired, Steven Levy says that Project Glass "is very far from public beta," and that an end of year product launch is "extremely unlikely."
Who is actually going to wear them?
The video recently released by Google featuring "Project Glass," Android-powered eyeglasses with a real-life, heads-up display, has received nearly 14 million views on YouTube. There's no doubt people are interested in the eyeglasses that are supposed to allow wearers to talk, text, check in to stores, receive coupons based on location, map routes and take and then share photos on social media — all by giving the commands aloud. Instead of seeing the icons on a mobile phone, the users see them, virtually, in front of their faces.
Google isn't saying when the glasses will launch, exactly how they work or how much they'll cost, but that's not stopping the media from speculating. Tons of blogs and stories featuring predictions about the glasses are already up. (Click here to read our take on how they may affect retailers.) Although I don't see the technology as far-fetched, I do have one question: Who will wear them?
Even if the naysayers are wrong and the technology is solid, Google will have to find a way to get the average consumer to wear the glasses. The company has released photos and videos of attractive models wearing them, but they're still a bit geeky. And I'm not the only person who shares this opinion.
"The geek factor may dampen the mass appeal, but weigh that against the use of something like Bluetooth," said Tom Nawara, VP of emerging solutions and innovation at Acquity Group, a digital marketing firm. "There used to be stigma there, and people got past it."
I don't know; I still think those headsets are nerdy, but fashion isn't the only problem with their wearability. What about having all that "stuff" floating in your line of sight? It could be distracting or even annoying to some consumers.
Nawara, who speculated that it would take at least a year for the glasses to be available, doesn't think the distraction issue will be much of a problem for consumer adoption. Just look at how people once thought talking on their phones while driving and walking was distracting. They either got used to it or pulled over to talk; they didn't abandon the behavior. The same is true with texting; despite a lot of legislation banning texting while driving, the laws have not been a barrier to mass adoption of texting as a popular form of communication. Nawara expects that Google Glasses could follow the same pattern.
"Humans as a species have evolved and have adapted to more and more sensory input," Nawara said. "Just 20 or 30 years ago no one had a cell phone, and we laughed at it like we are now chuckling at Google glasses."