Tuesday, October 30, 2012

Need a hand? Wearable robot arms give you two

IF YOU fancy an extra pair of hands, why not take a leaf out of Dr Octopus's book? A pair of intelligent arms should make almost any job a lot easier.
The semi-autonomous arms extend out in front of the body from the hips and are strapped to a backpack-like harness that holds the control circuitry. The prototype is the handiwork of Federico Parietti and Harry Asada of the Massachusetts Institute of Technology, who suggest that one of the first uses could be to help factory workers, or those with tricky DIY tasks to perform.
"It's the first time I've seen robot arms designed to augment human abilities. It's bold and out of keeping with anything I've ever seen to attach two arms to a human," says Dave Barrett, a roboticist and mechanical engineer at Olin College in Needham, Massachusetts.
So how are the arms controlled? Parietti and Asada designed the limbs to learn and hopefully anticipate what their wearer wants. The idea is that the algorithms in charge of the limbs would first be trained to perform specific tasks.
To demonstrate what the prototype can do, a camera observed a pair of workers helping each other drill into a loose metal plate. The camera measured the distances between the tools and the work surface, while feedback from sensors on the workers' bodies tracked their movements. This taught the arms where to grab and how much force to apply, so they could then assist a lone worker by both holding the drill and securing the plate.
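The article doesn't spell out the learning algorithm, so the sketch below is only a rough illustration of that train-then-assist loop: demonstrations are averaged into a target grasp point and clamping force, and a simple proportional controller drives the arms toward them. All names and numbers are invented for illustration; this is not the MIT implementation.

```python
import numpy as np

# Illustrative training data: clamp positions (m) and grip forces (N)
# recorded while two workers drilled a plate, as in the demonstration.
demos = [
    {"clamp_pos": np.array([0.31, 0.30, 0.32]), "force": np.array([24.0, 26.0, 25.0])},
    {"clamp_pos": np.array([0.29, 0.31, 0.30]), "force": np.array([23.0, 25.0, 27.0])},
]

def learn_task_model(demos):
    """Reduce the demonstrations to a target grasp point and clamp force."""
    pos = np.concatenate([d["clamp_pos"] for d in demos])
    force = np.concatenate([d["force"] for d in demos])
    return {"target_pos": pos.mean(), "target_force": force.mean()}

def control_step(model, measured_pos, measured_force, kp=50.0, kf=0.5):
    """One proportional-control step toward the learned grasp and force."""
    pos_cmd = kp * (model["target_pos"] - measured_pos)
    force_cmd = kf * (model["target_force"] - measured_force)
    return pos_cmd, force_cmd

model = learn_task_model(demos)
print(control_step(model, measured_pos=0.28, measured_force=20.0))
```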
If you think the idea of free-roaming robotic arms holding power tools sounds alarming, you aren't alone. "If a robotic arm can do useful work, it can also hurt you badly," says Barrett. "Traditionally, people are kept far away from robot arms because the arms are dangerous. The concept of strapping robotic arms onto a person is terrifying," he says.
Parietti and Asada have tried to address some of those safety fears by building the arms from softer material. Flexible components in the robotic arm, called series elastic actuators - invented in the 1990s by Gill Pratt and Matt Williamson at MIT - mean that less damage will be done if the arms do lose control.
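The appeal of series elastic actuators is easy to show: with a spring between the motor and the load, the output force is just spring stiffness times measured deflection, so force can be sensed cheaply and capped in software while the spring absorbs impacts. A toy one-dimensional model follows; the constants are invented, not taken from the MIT arms.

```python
K_SPRING = 300.0  # N/m, illustrative spring stiffness

def sea_force(motor_pos, load_pos):
    """Force reaching the load = stiffness * measured spring deflection."""
    return K_SPRING * (motor_pos - load_pos)

def safe_motor_command(desired_force, load_pos, max_force=40.0):
    """Place the motor so the spring transmits a force capped at max_force."""
    force = max(-max_force, min(max_force, desired_force))
    return load_pos + force / K_SPRING

print(sea_force(0.12, 0.05))           # 21.0 N through the spring
print(safe_motor_command(60.0, 0.05))  # motor position for the 40 N cap
```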
Dennis Hong at Virginia Tech in Blacksburg says that roboticists have spent the last 30 years attempting to make robots more springy and compliant, so they can work safely alongside humans. He says he has never come across robotic arms designed to be worn on the body.
The limbs were described at the Dynamic Systems and Control Conference in Florida last week. The work is funded by Boeing, and the limbs' first use could be to help workers build aircraft. The broader goal, say the researchers, is for the limbs and their users to work together so seamlessly that "humans may perceive them as part of their own bodies".



Originally posted at www.newscientist.com.

Soap bubble screen is 'the world's thinnest display'

 

Viewers may soon be able to watch films on soap bubbles - after researchers developed a technology to project images on a screen made of soap film.
An international team produced a display that uses ultrasonic sound waves to alter the film's properties and create either a flat or a 3D image.
The bubble mixture is more complex than the one sold in stores for children, but soap is still the main ingredient.
The team says the display is the world's thinnest transparent screen.
"It is common knowledge that the surface of soap bubble is a micro membrane. It allows light to pass through and displays the colour on its structure," the lead researcher, Yoichi Ochiai from the University of Tokyo, wrote in his blog.
"We developed an ultra-thin and flexible BRDF [bidirectional reflectance distribution function, a four-dimensional function defining how light is reflected at an opaque surface] screen using the mixture of two colloidal liquids."
Although traditional screens are opaque, the display created by Dr Ochiai and his colleagues - Keisuke Toyoshima of the University of Tsukuba in Japan and Alexis Oyama of Carnegie Mellon University in the US - varies in transparency and reflectance.
Using sound

The team managed to control and exploit these properties by hitting the bubble's membrane with ultrasonic sound waves, played through speakers.
Sonic waves alter the texture of a projected image, making it look smooth or rough.
"Typical screens will show every image the same way, but images should have different visual properties," Dr Oyama told the BBC.
"For example, a butterfly's wings should be reflective and a billiard ball should be smooth, and our transparent screen can change the reflection in real time to show different textures."
To change the transparency of the projected image, the scientists modified the wave's frequency.
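The researchers haven't published the drive parameters, so the sketch below only illustrates the control idea: map a desired transparency to an ultrasonic frequency and synthesize that tone for the speakers. The linear mapping and both endpoint frequencies are assumptions, not measured values.

```python
import numpy as np

SAMPLE_RATE = 96_000  # Hz; high enough to represent an ultrasonic tone

def drive_waveform(transparency, duration=0.1,
                   f_clear=28_000.0, f_diffuse=40_000.0):
    """Synthesize an ultrasonic sine wave for the membrane's speakers.

    The frequency is derived from a desired transparency in [0, 1] via a
    linear map; the map and the endpoint frequencies are illustrative
    assumptions, since the real membrane response is not public.
    """
    freq = f_clear + (1.0 - transparency) * (f_diffuse - f_clear)
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq * t)

# Request a half-transparent state for one 60 Hz video frame.
samples = drive_waveform(transparency=0.5, duration=1 / 60)
```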
"Our membrane screen can be controlled using ultrasonic vibrations. Membrane can change its transparency and surface states depending on the scales of ultrasonic waves," wrote Dr Ochiai in his blog.
"The combination of the ultrasonic waves and ultra thin membranes makes more realistic, distinctive, and vivid imageries on screen.
"This system contributes to open up a new path for display engineering with sharp imageries, transparency, BRDF and flexibility."
If several bubble screens are put together, viewers get a 3D effect and even a holographic projection.
The bubble is much harder to burst than a regular soap bubble, as the mixture contains special colloids - and objects can even pass through the film without popping it.
The team said such a screen could be useful for artists wanting to give their works a realistic feel, for museums - to display floating planets, for instance - and even for magicians.
There have been previous attempts to develop unconventional displays, including a computer screen made of water and a touchscreen made of ice.


Courtesy: BBC News, http://www.bbc.co.uk

Megapixel Camera? Try Gigapixel



DURHAM, N.C. -- By synchronizing 98 tiny cameras in a single device, electrical engineers from Duke University and the University of Arizona have developed a prototype camera that can create images with unprecedented detail.
The camera’s resolution is five times better than 20/20 human vision over a 120-degree horizontal field.
The new camera has the potential to capture up to 50 gigapixels of data, which is 50,000 megapixels. By comparison, most consumer cameras are capable of taking photographs with sizes ranging from 8 to 40 megapixels. Pixels are individual “dots” of data – the higher the number of pixels, the better the resolution of the image.
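Those claims can be sanity-checked with simple arithmetic: 20/20 vision resolves roughly one arcminute, so five times better means about 0.2 arcminute per pixel. The vertical field below is an assumption, since the release states only the horizontal one:

```python
# Back-of-the-envelope pixel count for the stated resolution.
ARCMIN_PER_PIXEL = 1.0 / 5.0  # five times finer than 20/20 vision (~1 arcmin)
H_FIELD_DEG = 120.0           # stated horizontal field of view
V_FIELD_DEG = 50.0            # assumed vertical field (not given in the release)

h_pixels = H_FIELD_DEG * 60.0 / ARCMIN_PER_PIXEL  # 36,000 pixels across
v_pixels = V_FIELD_DEG * 60.0 / ARCMIN_PER_PIXEL  # 15,000 pixels high
print(f"{h_pixels * v_pixels / 1e9:.2f} gigapixels")  # ~0.54, gigapixel class
```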
The researchers believe that within five years, as the electronic components of the cameras become miniaturized and more efficient, the next generation of gigapixel cameras should be available to the general public. Details of the new camera were published online in the journal Nature. The team’s research was supported by the Defense Advanced Research Projects Agency (DARPA).
The camera was developed by a team led by David Brady, Michael J. Fitzpatrick Professor of Electrical Engineering at Duke’s Pratt School of Engineering, along with scientists from the University of Arizona, the University of California, San Diego, and Distant Focus Corp.

“Each one of the microcameras captures information from a specific area of the field of view,” Brady said. “A computer processor essentially stitches all this information into a single highly detailed image. In many instances, the camera can capture images of things that photographers cannot see themselves but can then detect when the image is viewed later."
“The development of high-performance and low-cost microcamera optics and components has been the main challenge in our efforts to develop gigapixel cameras,” Brady said. “While novel multiscale lens designs are essential, the primary barrier to ubiquitous high-pixel imaging turns out to be lower power and more compact integrated circuits, not the optics.”
The software that combines the input from the microcameras was developed by an Arizona team led by Michael Gehm, assistant professor of electrical and computer engineering at the University of Arizona.
“Traditionally, one way of making better optics has been to add more glass elements, which increases complexity,” Gehm said. “This isn’t a problem just for imaging experts. Supercomputers face the same problem, with their ever more complicated processors, but at some point the complexity just saturates, and becomes cost-prohibitive."

“Our current approach, instead of making increasingly complex optics, is to come up with a massively parallel array of electronic elements,” Gehm said. “A shared objective lens gathers light and routes it to the microcameras that surround it, just like a network computer hands out pieces to the individual work stations. Each gets a different view and works on their little piece of the problem. We arrange for some overlap, so we don’t miss anything.”
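In software terms, the step Gehm describes amounts to blending many overlapping tiles into one mosaic. The sketch below shows only that blending idea for a single row of tiles; real gigapixel stitching must also register the tiles and correct lens distortion, which this omits.

```python
import numpy as np

def stitch_row(tiles, overlap):
    """Feather-blend a row of equal-size, horizontally overlapping tiles.

    A toy version of the mosaic step: each microcamera contributes one
    tile, and neighbouring tiles share `overlap` columns that are
    cross-faded into each other.
    """
    h, w = tiles[0].shape
    step = w - overlap
    mosaic = np.zeros((h, step * (len(tiles) - 1) + w))
    weight = np.zeros_like(mosaic)
    ramp = np.ones(w)
    fade = (np.arange(overlap) + 1) / (overlap + 1)  # strictly positive weights
    ramp[:overlap] = fade          # fade in on the left edge
    ramp[-overlap:] = fade[::-1]   # fade out on the right edge
    for i, tile in enumerate(tiles):
        x = i * step
        mosaic[:, x:x + w] += tile * ramp
        weight[:, x:x + w] += ramp
    return mosaic / weight

# Example: three fake 100x120-pixel tiles sharing 20 overlapping columns.
tiles = [np.random.rand(100, 120) for _ in range(3)]
panorama = stitch_row(tiles, overlap=20)
```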
The prototype camera itself is two-and-a-half feet square and 20 inches deep. Interestingly, only about three percent of the camera consists of optical elements; the rest is the electronics and processors needed to assemble all the information gathered. This, the researchers said, is the area where additional work to miniaturize the electronics and increase their processing ability will make the camera more practical for everyday photographers.
“The camera is so large now because of the electronic control boards and the need to add components to keep it from overheating,” Brady said. “As more efficient and compact electronics are developed, the age of hand-held gigapixel photography should follow.”
Co-authors of the Nature report with Brady and Gehm include Steve Feller, Daniel Marks, and David Kittle from Duke; Dathon Golish and Esteban Vera from Arizona; and Ron Stack from Distant Focus.


Originally posted by Duke University.

New NASA Satellites Have Android Smartphones for Brains



NASA is aiming to launch a line of small satellites called “PhoneSats” that are cheaper to make and easier to build than those it has produced in the past. To achieve this, engineers are using unmodified Android smartphones — in one prototype, HTC’s Nexus One, and in another, Samsung’s Nexus S — to perform many of a satellite’s key functions.
As NASA explains on its website, these off-the-shelf smartphones “offer a wealth of capabilities needed for satellite systems, including fast processors, versatile operating systems, multiple miniature sensors, high-resolution cameras, GPS receivers and several radios.”
“This approach allows engineers to see what capabilities commercial technologies can provide, rather than trying to custom-design technology solutions to meet set requirements,” NASA adds.
Building one of these prototype satellites costs a mere $3,500. Three are expected to launch aboard the first flight of Orbital Sciences Corporation’s Antares rocket from a NASA flight facility at Wallops Island, Va., later this year.


Originally posted at Mashable.

NASA's Nanosail-D 'Sails' Home -- Mission Complete

After spending more than 240 days "sailing" around the Earth, NASA's NanoSail-D -- a nanosatellite that deployed NASA's first-ever solar sail in low-Earth orbit -- has successfully completed its Earth orbiting mission.

Launched Nov. 19, 2010, as a payload on NASA's FASTSAT, a small satellite, NanoSail-D deployed its sail on Jan. 20, 2011.

The flight phase of the mission successfully demonstrated a deorbit capability that could potentially be used to bring down decommissioned satellites and space debris, which would re-enter and burn up completely in Earth's atmosphere. The team continues to analyze the orbital data to determine how future satellites can use this new technology.

A main objective of the NanoSail-D mission was to demonstrate and test the deorbiting capabilities of a large, low-mass, high-surface-area sail.
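The reasoning behind that objective is the standard drag law: deceleration scales with area divided by mass, so a large, light sail decays out of orbit far faster than a bare satellite. A rough worked estimate follows, using approximate NanoSail-D figures and an assumed upper-atmosphere density:

```python
# Rough drag estimate for a NanoSail-D-like spacecraft. All numbers are
# approximations or assumptions: roughly 10 m^2 of sail, ~4 kg of mass,
# and an assumed air density for ~650 km altitude. Density there swings
# strongly with solar activity, which is why the observed descent rate
# was cyclical.
RHO = 1e-13        # kg/m^3, assumed upper-atmosphere density
V = 7_500.0        # m/s, approximate orbital speed
CD = 2.2           # typical free-molecular-flow drag coefficient
AREA_SAIL = 10.0   # m^2, deployed sail area (approximate)
AREA_BUS = 0.03    # m^2, bare nanosatellite cross-section (assumed)
MASS = 4.0         # kg, spacecraft mass (approximate)

def drag_decel(area):
    """Deceleration from the drag law F = 0.5 * rho * v^2 * Cd * A."""
    return 0.5 * RHO * V**2 * CD * area / MASS

print(f"with sail:    {drag_decel(AREA_SAIL):.1e} m/s^2")
print(f"without sail: {drag_decel(AREA_BUS):.1e} m/s^2")
# Area is ~300x larger with the sail, so the orbit decays ~300x faster.
```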

"The NanoSail-D mission produced a wealth of data that will be useful in understanding how these types of passive deorbit devices react to the upper atmosphere," said Joe Casas, FASTSAT project scientist at NASA's Marshall Space Flight Center in Huntsville, Ala.

"The data collected from the mission is being evaluated in conjunction with data from FASTSAT science experiments intended to study and better understand the drag influences of Earth's upper atmosphere on satellite orbital re-entry," said Casas.

The FASTSAT science experiments are led by NASA's Goddard Space Flight Center in Greenbelt, Md., and sponsored by the Department of Defense Space Experiments Review Board, which is supported by the Department of Defense Space Test Program.

Initial assessment indicates NanoSail-D exhibited the predicted cyclical deorbit rate behavior that researchers had previously only theorized.

"The final rate of descent depended on the nature of solar activity, the density of the atmosphere surrounding NanoSail-D and the angle of the sail to the orbital track," said Dean Alhorn, principal investigator for NanoSail-D at Marshall Space Flight Center. "It is astounding to see how the satellite reacted to the sun's solar pressure. The recent solar flares increased the drag and brought the nanosatellite back home quickly."

NanoSail-D orbited the Earth for 240 days, performing well beyond expectations, and burned up during re-entry into Earth's atmosphere on Sept. 17, 2011.

NASA formed a partnership with Spaceweather.com to engage the amateur astronomy community to submit images of the orbiting NanoSail-D solar sail during the flight phase of the mission. NanoSail-D was a very elusive target to spot in the night sky -- at times very bright and other times difficult to see at all. Many ground observations were made over the course of the mission. The imaging challenge concluded with NanoSail-D's deorbit. Winners will be announced in early 2012.

For more information, visit:

http://www.nanosail.org/

The NanoSail-D experiment was managed at the Marshall Center, and designed and built by engineers in Huntsville. Additional design, testing, integration and execution of key spacecraft bus development and deployment support operations were conducted by engineers at NASA's Ames Research Center in Moffett Field, Calif. The experiment is the result of a collaborative partnership between NASA, the Department of Defense Space Test Program, the U.S. Army Space and Missile Defense Command, the Von Braun Center for Science and Innovation, Dynetics Inc. and ManTech NeXolve Corp.

For more information about NanoSail-D visit:

http://www.nasa.gov/mission_pages/smallsats/nanosaild.html
 
 
Janet L. Anderson, 256-544-0034
Marshall Space Flight Center, Huntsville, Ala.
janet.l.anderson@nasa.gov

Source: www.nasa.gov

Google reveals new Nexus devices

 

Two new Android devices, the Nexus 4 handset and the Nexus 10 tablet, will go on sale on 13 November, Google has announced.
The handset, made by manufacturer LG, can also work as a games controller when wirelessly connected to a TV.
Google said that the screen of the new Nexus 10 tablet by Samsung has "the world's highest resolution display".
UK prices start at £239 for the 8GB Nexus 4 and £319 for the 16GB Nexus 10.
The Nexus 4 smartphone has a new panoramic camera tool called Photo Sphere which Google claims is "unlike any panorama you have ever seen".
"Snap shots up, down and in every direction to create stunning 360-degree immersive experiences," says the firm on its blog.

Analysis

The battle lines are now drawn for the all-important Christmas shopping season.
Apple's iPad may have dominated tablet sales to this point, but it now faces cheaper competitors with their own media ecosystems and - in the case of Samsung's new Nexus 10 - a higher-resolution screen.
Purchase decisions may come down to brand loyalty: does a shopper identify with an Amazon, Apple, Google or Microsoft logo on the back of their device? This is still a very young market and the key players are essentially still in land grab mode.
The one thing that is certain is that those involved feel under pressure to innovate more quickly and lower their margins - or in some cases sell the hardware at break-even prices - all of which can only be good for the public.
In addition, users will have access to "Google Now", which flags up flight alerts, hotel recommendations and package tracking based on the user's location and the contents of their previous email messages.
Nexus 10 will be able to manage multiple profiles, meaning that the tablet can be shared by more than one person.
Google says more than 675,000 apps will be available via its Google Play store. Apple claims that over 275,000 apps are available for the latest version of the iPad.
Nexus 10 has a 10in (25cm) screen with a resolution of 300ppi (pixels per inch). Its advertised battery life is 500 hours on standby or nine hours of video play.
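That pixel density checks out against the Nexus 10 panel's native 2560x1600 resolution and its roughly 10-inch diagonal:

```python
import math

# PPI = diagonal pixel count / diagonal size in inches.
w_px, h_px = 2560, 1600   # Nexus 10 native resolution
diagonal_in = 10.055      # approximate panel diagonal

print(f"{math.hypot(w_px, h_px) / diagonal_in:.0f} ppi")  # ~300
```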
The firm also announced that it will add music to its Google Play media store in the UK, Germany, France, Italy and Spain on the same day.
New users will be able to upload 20,000 of their existing music tracks to their online accounts for free, and then stream music via the Cloud to any Android device connected to the internet.
Google also announced a refresh of its existing tablet, the Nexus 7, which will now offer 3G connectivity on a top-end model. However, all versions of the Nexus 10 are wi-fi only.
None of the new hardware will be made by Motorola.
Google purchased Motorola Mobility, the mobile phone manufacturing arm of the firm, in 2011 for $12.5bn (£7.7bn) but a spokesperson said that the company was not receiving any "special treatment".
The announcement was made after Google scrapped plans for a high-profile event to mark the launches in New York.
The event was cancelled because of the approach of Hurricane Sandy.

Source: http://www.bbc.com

Drones set to share sky with domestic air traffic

 

Tests have been carried out to see whether military drones can mix safely in the air with passenger planes.
The tests involved a Predator B drone fitted with radio location systems found on domestic aircraft that help them spot and avoid other planes.
The tests will help to pave the way for greater use of drones in America's domestic airspace.
The flight tests took place off the coast of Florida in early August, but details have only just been released.
The Predator B used in the tests is a modified version of the Guardian drone typically used by the US Navy. While such robot planes have been widely used in war zones and on military operations, their use over home soil has been restricted.
Politicians have given the Federal Aviation Administration (FAA) until 2015 to prepare its air traffic systems for the use of drones, both commercial and military, over US territory.
For the tests, the drone was fitted with a location system known as Automatic Dependent Surveillance-Broadcast (ADS-B), which the FAA wants all domestic aircraft to use by 2020.
Once widely adopted, ADS-B will change America's air traffic control from a ground-based system to one that takes flight position data from satellites. By switching, the FAA hopes to simplify the job of managing air traffic and improve safety.
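ADS-B "extended squitter" frames have a fixed 112-bit layout (downlink format, aircraft address, then the position or identity payload), so the basic fields fall out with a few bit operations. A minimal sketch follows, using a sample frame widely circulated in decoding tutorials; parity checking and full position decoding are omitted.

```python
def parse_adsb(hex_frame):
    """Pull the basic fields out of a 112-bit ADS-B extended squitter."""
    bits = bin(int(hex_frame, 16))[2:].zfill(len(hex_frame) * 4)
    downlink_format = int(bits[0:5], 2)      # 17 indicates ADS-B
    icao_address = hex(int(bits[8:32], 2))   # 24-bit airframe address
    type_code = int(bits[32:37], 2)          # e.g. 1-4 identity, 9-18 position
    return downlink_format, icao_address, type_code

# Sample frame from decoding tutorials (an aircraft identification message).
print(parse_adsb("8D4840D6202CC371C32CE0576098"))  # (17, '0x4840d6', 4)
```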
The drone completed its trials successfully, said a statement from drone maker General Atomics. The drone's location and flight path were precisely monitored throughout its flight, said the defence firm, suggesting such craft can "fly cooperatively and safely" in domestic US airspace.
More tests are planned.

Source: BBC News
