Android Weekly: Top 8 Android stories this week

Here comes the L release

At Google I/O, the company gave us a preview of the features and design coming to Android devices this fall through the new L release. From a new design language called Material, to building blocks for creating a consistent experience across platforms, to cool new features, the L release is the biggest Android update in years. Plus, we get to try out all the goodies right now, thanks to beta versions available for the Nexus 5 and Nexus 7 (2013).

Android Wear

Android Wear was no secret, but this week we saw the full release of Google’s smartwatch-centric operating system. The first devices to be powered by Android Wear are the LG G Watch and Samsung Gear Live, with the round Moto 360 expected later this summer. Asus, too, is rumored to release an Android Wear watch, due in September for $150!

Android One: Google’s play at the low end

Flagships may be flashy, but the majority of Android devices are at the low end of the roster. Google wants to improve the quality of entry-level smartphones with Android One, a program that will bring decent devices for $100 or less to developing markets, starting with India.

Android on the road

Using Android hands-free was one of the big themes of Google I/O this year, and nowhere is that more important than behind the wheel. Android Auto attempts to bring the power of Android to the dashboard, without compromising on safety and usability.

Android TV – does Google finally get TV?

From your phone, to your wrist, to your car, and now your living room. Android TV is not Google’s first attempt at conquering the living room, but it’s the most promising yet. With a sleek interface adapted to the big screen and built-in gaming capabilities, Android TV could turn out to be a real success.

Chrome OS and Chromecast news

Google’s other big platform is Chrome, and Google I/O brought some cool news on this front as well, including the ability to run Android apps on Chrome OS and screen mirroring on the Chromecast.

The future of Nexus

The Nexus program isn’t going anywhere, a top Android executive said this week, putting an end to months of rumors and speculation. So, when can we expect a new Nexus device? According to one report, it may be this fall, when HTC’s Volantis tablet is due, possibly alongside the full launch of the L release.

On the horizon: Note 4

The next big device on the horizon is Samsung’s Galaxy Note 4. Rumors claim Samsung is speeding up its production cycle, with the device scheduled to be available as early as the beginning of September.

Project Tango

What is Project Tango?

As we walk through our daily lives, we use visual cues to navigate and understand the world around us. We observe the size and shape of objects and rooms, and we learn their position and layout almost effortlessly over time. This awareness of space and motion is fundamental to the way we interact with our environment and each other. We are physical beings that live in a 3D world. Yet, our mobile devices assume that the physical world ends at the boundaries of the screen.
The goal of Project Tango is to give mobile devices a human-scale understanding of space and motion.
Our team has been working with universities, research labs, and industrial partners spanning nine countries around the world to build on the last decade of research in robotics and computer vision, concentrating that technology into a unique mobile device. We are putting early prototypes into the hands of developers that can imagine the possibilities and help bring those ideas into reality.
We hope you will take this journey with us. We believe it will be one worth traveling.
- Johnny Lee and the ATAP-Project Tango Team

3D motion and depth sensing

Project Tango devices contain customized hardware and software designed to track the full 3D motion of the device while simultaneously creating a map of the environment. These sensors allow the device to make over a quarter million 3D measurements every second, updating its position and orientation in real time and combining that data into a single 3D model of the space around you.
They run Android and include development APIs that provide position, orientation, and depth data to standard Android applications written in Java or C/C++, as well as to the Unity Game Engine. These early prototypes, algorithms, and APIs are still in active development, so these experimental devices are intended only for the adventurous and are not a final shipping product.
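To give a rough feel for the kind of data such APIs expose, here is a minimal Java sketch of an app consuming pose and depth updates. The types and callbacks below (DevicePose, MotionListener) are hypothetical placeholders for illustration only, not the actual Project Tango API, which was still in flux at the time.

// Hypothetical sketch of how an app might consume Project Tango-style data.
// These types are illustrative placeholders, not the real Tango API.
public final class TangoStyleSketch {

    /** 6-degree-of-freedom pose: where the device is and how it is oriented. */
    static final class DevicePose {
        final double[] translation;   // x, y, z in metres, relative to the start of tracking
        final double[] rotation;      // orientation as a quaternion (x, y, z, w)
        final double timestampSeconds;

        DevicePose(double[] translation, double[] rotation, double timestampSeconds) {
            this.translation = translation;
            this.rotation = rotation;
            this.timestampSeconds = timestampSeconds;
        }
    }

    /** Callback an application would register to receive motion and depth updates. */
    interface MotionListener {
        void onPoseAvailable(DevicePose pose);      // fired as the device moves through space
        void onDepthAvailable(float[] xyzPoints);   // packed x,y,z points from the depth sensor
    }

    public static void main(String[] args) {
        MotionListener listener = new MotionListener() {
            @Override
            public void onPoseAvailable(DevicePose pose) {
                System.out.printf("Device at (%.2f, %.2f, %.2f) m%n",
                        pose.translation[0], pose.translation[1], pose.translation[2]);
            }

            @Override
            public void onDepthAvailable(float[] xyzPoints) {
                // The quarter-million measurements per second arrive as point clouds like this.
                System.out.println("Received " + (xyzPoints.length / 3) + " depth points");
            }
        };

        // Feed the listener one fake update so the sketch runs end to end.
        listener.onPoseAvailable(new DevicePose(
                new double[]{0.10, 0.00, 1.25}, new double[]{0, 0, 0, 1}, 0.033));
        listener.onDepthAvailable(new float[]{0.1f, 0.2f, 1.5f, 0.3f, 0.1f, 2.0f});
    }
}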

What could I do with it?

What if you could capture the dimensions of your home simply by walking around with your phone before you went furniture shopping? What if directions to a new location didn’t stop at the street address? What if you never again found yourself lost in a new building? What if the visually-impaired could navigate unassisted in unfamiliar indoor places? What if you could search for a product and see where the exact shelf is located in a super-store?
Imagine playing hide-and-seek in your house with your favorite game character, or transforming the hallways into a tree-lined path. Imagine competing against a friend for control over territories in your home with your own miniature army, or hiding secret virtual treasures in physical places around the world?

Project Tango Tablet Development Kit

Our 7-inch development kit is powered by the new NVIDIA Tegra K1 processor and packs 4GB of RAM, 128GB of storage, a motion tracking camera, integrated depth sensing, WiFi, BTLE, and 4G LTE.

These development kits are designed for professional developers interested in exploring the future of mobile 3D sensing. Developers will receive updates as the software algorithms and APIs evolve. These development kits are not a consumer device and will be available in limited quantities.

For more info on the tablet, go to the Google page: Project Tango Tablet

Project Ara

Project Ara is the codename for an initiative by Google that aims to develop a free, open hardware platform for creating highly modular smartphones. The platform will include a structural frame that holds smartphone modules of the owner's choice, such as a display, keyboard or an extra battery. It would allow users to swap out malfunctioning modules or upgrade individual modules as innovations emerge, providing longer lifetime cycles for the handset and potentially reducing electronic waste. The first model of the modular phone is scheduled to be released in January 2015 and is expected to cost around $50.






A plethora of news has come out of Google’s I/O conference this week, most of it revolving around Android and the company’s plan to use its popular mobile OS virtually everywhere. Representatives from Google talked about putting various customized versions of Android in automobiles, wearable devices, larger mobile devices, and of course more smartphones. Google even showed off a funky, super-cheap virtual reality headset made of cardboard at I/O. The cardboard pieces can be quickly opened up and folded into a structure that holds a smartphone, and the cardboard structure essentially splits the smartphone’s screen into left and right displays, one for each eye. Google aptly named the project ‘Cardboard’, and while it may seem silly at first, it’s actually quite ingenious. You can read more about Cardboard here.
For a hardware junkie like me, one of the most interesting bits of information to come out of the event wasn’t even mentioned during the initial keynote and was ultimately somewhat of a failure.  During a live, on-stage demo in one of I/O’s smaller breakout sessions, members of Google’s Advanced Technology and Projects (ATAP) group, including Paul Eremenko, powered up a functional Project Ara device, which then began to boot Android. Eremenko is the current head of Project Ara at Google.
Google Project Ara module mock-ups.
If you’re not familiar with Project Ara, it is Google’s initiative (which was originally started at Motorola Mobility) to build a fully modular, open hardware platform for smartphones. If all goes to plan, there will be three initial Project Ara frames (Mini, Medium, and Large) that will accept various modules for things like the CPU, memory, camera, and network controllers. Users of Project Ara devices (assuming they ever make it to market) could essentially configure and upgrade their smartphones at will, similar to the desktop PCs of today. The idea is that if a Project Ara user wants a faster processor or better camera, he or she could simply swap out that module, without having to replace the entire smartphone. The project is ambitious and has garnered plenty of justified skepticism to this point, partly because there have been no public demos of working prototypes—until now.
During the demo, Eremenko and crew are up on stage, while a cameraman points his camera at the Project Ara prototype. The feed from the camera is projected on a large screen, so the attendees in the audience can see what’s happening. The prototype phone is powered up, and shortly thereafter an Android loading screen is displayed. A few moments after that, a portion of the clock on the lock screen appears on the device’s display and then the demo is concluded, amid cheers from the audience.
The Project Ara prototype device shown at Google I/O.
It’s not clear if the Project Ara device used in the demo hung or if its failure to display the entire lock screen was a simple graphics rendering issue, but it doesn’t really matter. The fact that a semi-working prototype was shown to the public is a major step forward.
A lot still has to happen before Project Ara can bear fruit. Android doesn’t have support for modularity just yet (that should be coming in a few months), hardware partners have to start making viable modules, and consumers need to be educated about the platform. Regardless, it seems Google plans to continue development of the platform, and who knows what the future holds. If Project Ara is successful and the idea of a modular smartphone takes hold (a BIG if), it could be a game changer.

LG G3’s laser auto focus

The newly announced LG G3 is a feature-packed flagship handset, offering up a range of cutting-edge technologies that give other flagships a run for their money. Although the QHD display might be the talk of the town, there’s another really interesting piece of technology tucked away in the LG G3’s camera – laser auto focus.
As the name implies, the LG G3 makes use of a laser sensor system to adjust the focus of the handset’s rear camera, rather than, say, the phase detection focusing used in the Galaxy S5 and more expensive DSLR cameras, or the popular contrast detection method. The obvious questions, then: is it better, and how does it work?

How it works

Whilst laser technology sounds a tad gimmicky and a little on the sci-fi side, the use of laser technologies for calculating distance is nothing new. The idea has been used in a number of industries and products for many years, including rangefinders and even some compact cameras.
The familiar principle remains the same when it comes to the LG G3. There’s a small laser transmitter located on the back of the handset near the camera sensor, which is where most of the work takes place. The first step involves firing out a short laser light burst, which is then reflected back off whatever you happen to be pointing the camera at. This light then travels back towards the sensor, where the software calculates the time it takes for the light to leave and return, resulting in a very accurate measurement of how far away the target object is.
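As a rough worked example of this time-of-flight principle, the familiar Distance = (speed of light × time) / 2 relationship, the short Java snippet below converts a measured round-trip time into a distance. The round-trip value is made up for illustration; a real return from a subject a couple of feet away arrives in only a few nanoseconds.

public final class TimeOfFlight {

    private static final double SPEED_OF_LIGHT_M_PER_S = 299_792_458.0;

    // Distance = (speed of light x round-trip time) / 2, because the pulse travels out and back.
    static double distanceMetres(double roundTripSeconds) {
        return (SPEED_OF_LIGHT_M_PER_S * roundTripSeconds) / 2.0;
    }

    public static void main(String[] args) {
        // A subject about two feet (~0.61 m) away returns the pulse in roughly 4 nanoseconds.
        double roundTripSeconds = 4.07e-9;
        System.out.printf("Estimated distance: %.2f m%n", distanceMetres(roundTripSeconds));
    }
}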
The beam that is emitted is extremely thin, which means that there’s a lower chance of multiple returns caused by reflections and refractions. However, such a technique doesn’t always produce the desired accuracy and results, especially at longer distances and in more open environments. As you can imagine, the receiving sensor in a smartphone is quite compact, and therefore the margin for error is also relatively small.
To counteract this issue, LG’s camera operates as more of a hybrid device, making use of either the laser or contrast detection methods as required. LG compensates for poor laser returns, reflective surfaces, and transparent surfaces (which the laser would pass through) by falling back to contrast detection. Contrast detection uses the main image sensor, sweeping upwards through increasing levels of contrast to find the difference between adjacent pixels. In situations where contrast detection is used, LG’s laser system allows the auto focus algorithm to automatically skip the first two feet of distance, which helps to speed up the process, as sketched below. However, there are no actual depth calculations with the contrast method, which makes it more difficult to track moving objects.
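LG has not published its firmware logic, but the fallback behaviour described above can be sketched roughly as follows: use the laser distance when the return looks trustworthy, otherwise run a contrast sweep that skips the first two feet. The thresholds and method names here are assumptions made purely for illustration.

public final class HybridAutoFocusSketch {

    // Illustrative thresholds; the real values are LG's and are not public.
    private static final double MAX_RELIABLE_LASER_METRES = 3.0;
    private static final double TWO_FEET_METRES = 0.61;

    /** Result of a laser ranging attempt: distance in metres, or NaN if no usable return. */
    static double readLaserDistance() {
        return 0.45; // pretend the laser saw a subject 45 cm away
    }

    /** Simulated contrast sweep starting from a minimum distance; returns the chosen focus distance. */
    static double contrastSweepFrom(double startMetres) {
        // A real sweep steps the lens and picks the position with maximum edge contrast.
        System.out.printf("Contrast sweep starting at %.2f m%n", startMetres);
        return startMetres + 1.0; // placeholder result
    }

    static double chooseFocusDistance() {
        double laser = readLaserDistance();
        boolean laserUsable = !Double.isNaN(laser) && laser <= MAX_RELIABLE_LASER_METRES;

        if (laserUsable) {
            // Close, clean return: focus directly on the measured distance.
            return laser;
        }
        // Poor or missing return (long range, reflective or transparent subject):
        // fall back to contrast detection, skipping the first two feet the laser has ruled out.
        return contrastSweepFrom(TWO_FEET_METRES);
    }

    public static void main(String[] args) {
        System.out.printf("Focus distance: %.2f m%n", chooseFocusDistance());
    }
}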
Laser range finder: Distance = (speed of light × time) / 2. Image source: IITK
By using this hybrid system, the LG G3 is able to very quickly and accurately detect the focal distance of closer objects. LG claims almost instantaneous detection of objects within two feet of the camera. The LG G3 takes around 276 milliseconds to focus an image, which is especially important when trying to focus on moving objects. The laser also operates perfectly in low-light conditions, whereas contrast detection would struggle to tell the difference between multiple dark pixels.

Does it make a difference?

The biggest benefits of laser-assisted focus come from increased speed and accuracy when focusing, especially at short distances where the beam’s bounce-back is more predictable. Consistent performance in darkened environments is also a big benefit, especially when compared with contrast-based focus. However, lasers aren’t necessarily reliable in every environment.
Just like contrast detection, phase detection is another passive way of detecting the optimum focal point. It doesn’t instantly detect focal distance through mathematical calculations, but instead relies on a little bit of trial and error to correct the sensor’s focus. For the sake of a brief comparison: phase detection uses a beam splitter to separate incoming light into multiple images, which can then be compared in terms of light intensity to determine whether or not the image is in focus. The Galaxy S5 uses a contrast/phase hybrid auto focus, whilst most other handsets rely solely on contrast detection.
Phase detect sensor, for comparison: (1) too near, (2) correct, (3) too far, and (4) way too far. Source: PhotographyLife
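To make that comparison a little more concrete, here is a small, self-contained Java sketch of the idea behind phase detection: two intensity profiles from the split light paths are cross-correlated, and the offset between them indicates whether, and roughly in which direction, the lens is out of focus. The toy profiles, the correlation search, and the direction labels are simplified illustrations, not how any particular camera implements it.

public final class PhaseDetectSketch {

    /** Returns the offset (in samples) that best aligns profile b with profile a. */
    static int bestShift(double[] a, double[] b, int maxShift) {
        int best = 0;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (int shift = -maxShift; shift <= maxShift; shift++) {
            double score = 0;
            for (int i = 0; i < a.length; i++) {
                int j = i + shift;
                if (j >= 0 && j < b.length) {
                    score += a[i] * b[j];  // simple cross-correlation
                }
            }
            if (score > bestScore) {
                bestScore = score;
                best = shift;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Two toy intensity profiles from the split light paths; the second is offset by +2 samples.
        double[] left  = {0, 0, 1, 3, 5, 3, 1, 0, 0, 0};
        double[] right = {0, 0, 0, 0, 1, 3, 5, 3, 1, 0};

        int shift = bestShift(left, right, 4);
        if (shift == 0) {
            System.out.println("In focus");
        } else {
            System.out.println("Out of focus; adjust lens " + (shift > 0 ? "forward" : "backward")
                    + " (phase offset " + shift + " samples)");
        }
    }
}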
Again, the only real advantage laser sensors have over phase detection is the speed at which focus can be achieved, in certain scenarios. It’s tough to say which of these two is best overall, as different shot types will likely favour slightly different approaches. However, the combination of laser and contrast detection is bound to be better than a contrast-only setup, giving the LG G3 an advantage over most other smartphones. Of course, real world tests will be the best judge.
By developing a hybrid system, LG aims to take the best of both worlds, and on paper it sounds like an impressive and useful piece of technology. We’ll have plenty more hands on time with the LG G3’s camera when it comes to our full review, where we’ll definitely be putting the laser sensor through its paces.