
Thursday, May 22, 2008

ROBOTICS - Ant navigation

Next time you find yourself lost despite having a map and satellite navigation, spare a thought for the unfortunate ant that must take regular trips home to avoid losing its way. Dr Markus Knaden, from the University of Zurich, will report that a visit back to the nest is essential for ants to reset their navigation equipment and avoid getting lost on foraging trips. "Knowledge about path integration and landmark learning gained from our experiments with ants has already been incorporated in autonomous robots. Including a 'reset' of the path integrator at a significant position could make the orientation of the robot even more reliable," says Dr Knaden, who will speak on Tuesday 4th April at the Society for Experimental Biology's Main Annual Meeting in Canterbury, Kent [session A4].

Ants that return from foraging journeys can use landmarks to find their way home, but in addition they have an internal backup system that allows them to take straight shortcuts back to the nest even when the outbound part of the foraging run was very winding. This backup system is called the 'path integrator' and constantly reassesses the ant's position using an internal compass and a measure of distance travelled. Knaden and his colleagues hypothesised that because the path integrator is a function of the ant's brain, it is prone to accumulate mistakes over time unless it is regularly reset to the original error-free template, which is exactly what the researchers have found.
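
The path integrator described here is essentially dead reckoning: summing compass headings and travelled distances into a single home vector, then reversing it. The following minimal Python sketch illustrates the principle only (it is not the ants' actual neural mechanism, nor the researchers' model): a winding outbound trip collapses into one straight homeward vector, and the 'reset' amounts to zeroing the accumulator at the nest.

```python
import math

def integrate_path(steps):
    """Sum (heading_deg, distance) steps into an x, y displacement.

    Each step stands in for one compass reading plus odometry; the running
    total is the vector from the nest to the ant's current position.
    """
    x = y = 0.0
    for heading_deg, distance in steps:
        x += distance * math.cos(math.radians(heading_deg))
        y += distance * math.sin(math.radians(heading_deg))
    return x, y

def home_vector(steps):
    """Heading (degrees) and distance of the straight shortcut to the nest."""
    x, y = integrate_path(steps)
    return math.degrees(math.atan2(-y, -x)) % 360, math.hypot(x, y)

# A winding outbound trip still yields a single straight homeward vector;
# entering the nest would "reset" the integrator by zeroing x and y.
trip = [(0, 10), (90, 10), (180, 5), (90, 5)]
heading, distance = home_vector(trip)
```

Because small errors in each heading and distance estimate compound with every step, the accumulated home vector drifts over a long trip, which is why a periodic reset at a known position keeps the scheme reliable.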

When they moved ants from a feeder back to a position either within the nest or next to the nest, they found that only those ants that were placed in the nest were able to set off again in the right direction to the feeder. Those left outside the nest set off in a feeder-to-home direction (i.e. away from the nest in completely the opposite direction to the source of food) as if they still had the idea of 'heading home' in their brains. "We think that it must be the specific behaviour of entering the nest and releasing the food crumb that is necessary to reset the path integrator", says Knaden. "We have designed artificial nests where we can observe the ants after they return from their foraging trips in order to test this."

What next? The group plan to study other ant species that live in landmark rich areas. "Maybe we will find that such ants rate landmarks more highly and use them, not the nest, to reset the path integrator", explains Knaden.

ExoMars rover concept

One of the attractions at the ILA2006 Space Pavilion is the full-scale ExoMars rover mock-up, based on an artist's impression of Europe's next mission to Mars, the first robotic mission within the European Space Exploration Programme Aurora.
The large rover and its deployment on the surface of Mars are probably the most challenging elements of the ExoMars mission, currently slated for launch in 2011, which will search for traces of life on and underneath the surface of Mars. The rover will carry a payload, dubbed Pasteur, and will be equipped with a drilling system that will reach up to two metres below the Martian surface. Through the mock-up and accompanying background animation the many visitors to ILA2006 could gain an appreciation of the different mission phases, the rover surface operations, as well as the rover's expected size.

While an artist's view was used to produce both the rover and its animated graphics, European industry is gearing up to design and manufacture the real thing after having conducted conceptual studies (Phase A) both for the mission and for the rover as one of the mission elements.

The ExoMars mission, under the prime contractorship of Alcatel Alenia Space in Turin, is currently in its preliminary design (B1) phase. The design and manufacturing of major mission elements, such as the Carrier, the Descent module and, of course, the Rover, will be awarded, in compliance with ESA procurement rules, to European and Canadian companies through dedicated Invitations to Tender that will be issued in the coming weeks by the prime contractor, Alcatel Alenia Space.

ExoMars is one of the components of ESA's Aurora Space Exploration Programme, an optional programme under the remit of the Directorate of Human Spaceflight, Microgravity and Exploration. Of the 14 countries supporting the mission, Italy makes the largest contribution, followed by Great Britain, France and Germany.

Contract allocation will reflect industrial expertise as well as the level of national contribution - a subject which has fuelled intense discussions among Participating States. No final decision, however, has been taken with respect to which element will be awarded to whom, even though, given the level of contribution, the available expertise and the consensus among participants, certain decisions can reasonably be anticipated, especially at system design and integration level. The design and building of actual hardware is mostly open to competition among the industries of all Participating States.

European industry is readying itself for the upcoming challenge of designing and building a rover. This activity is a good sign of Europe's commitment to, and confidence in, landing on Mars at the beginning of the next decade and furthering our knowledge of the Red Planet, especially with respect to its biological features.

The final configuration of the ExoMars rover and of the other mission elements, however, remains to be defined, in particular as far as the contribution of each country and its industries is concerned.

A presentation to interested European and Canadian industry is being organised at ESA's research and technology centre, ESTEC, in Noordwijk, the Netherlands, on 1 June 2006 with the aim of presenting all the available opportunities for industry to be involved in taking Europe on and beneath the surface of Mars.

I robot, you companion

The concept of a cognitive robotic companion inspires some of the best science fiction but one day may be science fact following the work of the four-year COGNIRON project funded since January 2004 by the IST's Future and Emerging Technologies initiative. But what could a cognitive robot companion do?

"Well, that's a difficult question. The example that's often used is a robot that's able to fulfil your needs, like passing you a drink or helping in everyday tasks," says Dr Raja Chatila, research director at the Systems Architecture and Analysis Laboratory of the French Centre National de la Recherche Scientifique (LAAS-CNRS), and COGNIRON project coordinator.

"That might seem a bit trivial, but let me ask you a question: In the 1970s, what was the use of a personal computer?" he asks.

It's a good point. In fact, it was then impossible to imagine how PCs would change the world's economics, politics and society in just 30 years. The eventual uses, once the technology developed, were far from trivial.

COGNIRON set out on the same principle, given that society is constantly evolving, and the project partners hope to tackle some of the key issues that need to be resolved for the development of a cognitive robot companion, which could serve as an assistant for disabled and elderly people or for the general population. Who wouldn't like, for instance, their breakfast ready when they awoke, deliveries accepted while they were at work and their apartment cleaned upon their return?

The key issue governing these tasks is intelligence, and developing intelligent behaviour on a number of fronts is the cornerstone and main work of COGNIRON.

Organised around seven key research themes, the project studies multimodal dialogues, detection and understanding of human activity, social behaviour and embodied interaction, skill and task learning, spatial cognition and multimodal situation awareness, as well as intentionality and initiative. Finally, the seventh research theme, systems levels integration and evaluation, focuses on integrating all the other themes into a cohesive, cogitating whole.

Dr Chatila summarises the purpose of the seven themes. "Research breaks down into four capacities required by a cognitive robot companion: perception and cognition of environment; learning by observation; decision making; communication and interaction with humans."

Decision-making is a fundamental capability of a cognitive robot whether it's for autonomous deliberation, task execution, or for human-robot collaborative problem solving. It also integrates the three other capacities: interaction, learning and understanding the environment.

"Getting a robot to move around a human, without hurting them, and while making them feel comfortable, is a vital task," says Dr Chatila.

To work, it means a robot must pick up subtle cues. If, for instance, a human leans forward to get up, the robot needs to understand the purpose of that movement. What's more, much of human communication is non-verbal, and such cognitive machines need to pick up on that if they are to be useful, rather than irritating.

Even in verbal communication there are many habits robots need to acquire that are so second nature to humans that we never think of them. "For example, turn taking in conversation. Humans take turns to [talk], we need to find a way to make robots do the same," says Dr Chatila. A robot that keeps interrupting would get on an owner's nerves.

To tackle the problems, the researchers took inspiration from natural cognition as it occurs in humans, which is one reason why a cognitive robot companion needs to be able to learn.

Despite its highly ambitious aims, the project made enormous progress, and the team feel confident they will meet their criteria for success: three concrete implementations, the so-called 'Key Experiments', implemented on real robots for the integration, demonstration and validation of the research results.

One experiment will feature a robot building a model of its environment in the course of a home tour, another will feature a curious and proactive robot that will be able to infer that a human needs something to be done, while the third one will demonstrate a robot's ability to learn by imitation and repetition.

In fact, the project has already partially implemented all three experiments, eighteen months before the project ends. "The three experiments are an expression of our achievement in research and integration," says Dr Chatila.

He emphasises that this is a promising start, but it will be a very long road before a fully functional Cognitive Robot Companion will be realised and potentially commercialised. COGNIRON will advance the state-of-the-art and understanding of the different components required but will not yet allow a fully integrated robot endowed with all the required capacities to be built.

Robosapien: The new evolution of man?

The Robosapien is the first affordable intelligent entertainment humanoid of its kind. Developed by robotics physicist Dr. Mark W. Tilden, Robosapien is the first robot based on the science of applied biomorphic robotics, enabling him to act more like a human. Tilden, the pioneer of applied biomorphic robotics, has worked for NASA and other government research agencies developing advanced robotic technologies.

Feisty and filled with personality, Robosapien is a humanoid with attitude that comes to life at your command. Using the ergonomic remote control, you can command Robosapien to perform up to 67 pre-programmed functions, including pick-up, throw, high-five, whistle, dance and three different karate moves.

Robosapien reacts to both touch and sound signals from his environment, and sensors in his feet allow him to recognize and avoid obstacles without help. He has two types of three-pronged grippers that enable him to pick up cups, socks, pencils and other small, light objects.

Robosapien also comes equipped with fast, fully articulated arms, and his fluid biomechanical movements and pendulum walking motion make his movements appear more human than robotic. Robosapien's impressive flexibility is evident as he walks at two different speeds, dances and 'turns on a dime'.

Robosapien is manufactured by Wow Wee Toys Ltd., which is a leading manufacturer of interactive high-tech toys and innovative electronic entertainment products.

Humanoid Robot is Going To School

The world's most advanced humanoid robot is headed to school this spring. ASIMO (Advanced Step in Innovative Mobility) will perform a special demonstration in March 2004 at the school that submits the winning entry to the ASIMO Essay Contest, a national contest on robotics. All public and private elementary, middle, junior high and high schools in the 48 contiguous United States are eligible to enter.

"The goal of the ASIMO Essay Contest is to encourage students across the nation to dream about the future of robotics," said Jeffrey Smith, leader of the ASIMO North American Project. "But more importantly, with this contest we hope to inspire students to work together to make their dreams a reality."

The winning school will receive an educational and interactive demonstration about robotics featuring ASIMO. This presentation will illustrate ASIMO's technical capabilities, including walking forward and backward, balancing on one leg, dancing and even climbing stairs. Students will also learn how ASIMO was developed, understand the challenges of creating humanoid robots and explore potential future applications for robotic technologies.

To enter the contest, each school elects one class or group of students to represent the school in the competition. This designated group submits an essay of 1,000 words or less describing their vision for the future role of humanoid robots. Entries are due by December 31, 2003. Information and contest entry forms are available at www.asimo.honda.com.

This school visit will be the final stop on the "Say Hello to ASIMO" North American Educational Tour, a 15-month national educational tour presented by American Honda Motor Co., Inc., a world leader in advanced mobility. The tour reaches out to students across the country through the nation's top science museums and educational institutions to create a unique educational experience inspiring young students to pursue academic study in the sciences.

To date, the "Say Hello to ASIMO" North American Educational Tour has attracted full-capacity audiences at the Liberty Science Center (Jersey City, N.J.), the Museum of Science and Industry (Chicago, Ill.), Carnegie Mellon University (Pittsburgh, Pa.), MOSI (Tampa, Fla.), the Museum of Science (Boston, Mass.), SciTrek, the Science and Technology Museum of Georgia (Atlanta, Ga.), the Washington Convention Center (Washington, D.C.), The Franklin Institute (Philadelphia, Pa.), the Ontario Science Centre (Toronto, Ontario) and the Montreal Science Centre (Montreal, Quebec). Future tour stops include: Seattle, Wash.; San Jose, Calif.; and Los Angeles, Calif. Since the launch of the tour in January 2003, more than 67,000 students and museum visitors have seen ASIMO in person. More information about the tour is available at www.asimo.honda.com.

ASIMO was developed by Honda Motor Co., Ltd. after more than 17 years of research. Created for the purpose of someday helping people in need, ASIMO can walk forward and backward, turn smoothly without pausing, climb stairs and maintain balance while walking on uneven slopes and surfaces. ASIMO also has two arms and two hands, which ease such tasks as reaching for and grasping objects, switching lights on and off, and opening and closing doors.

i-Cybie

i-Cybie is an intelligent, interactive robotic cyber dog specifically designed to react and respond like a real dog. Made of 1,400 parts and over 90 feet of wire, Tiger's latest canine friend will happily wander around your house, greet you, wag his tail and give you his paw. The ideal pet for the 21st century, i-Cybie has all the love and entertainment of a real dog but without the fuss. He will happily perform and play for you, keeping you constantly entertained.

With a choice of metallic blue or gold, streamlined body and legs, i-Cybie is the ultimate hi-tech hound with a fantastic personality. Displaying four main emotions, i-Cybie shows you when he is happy, sad, hyper or barking mad! His behaviour will reflect these changing moods. 16 motors drive i-Cybie's joints, giving him total flexibility and realistic movement. With smooth and slick manoeuvrability, i-Cybie has 14 doggie actions all activated by remote control, voice or sound commands. He will sit, beg, rollover, shake his head, act as a guard dog and even cock his leg up! His eyes will also reflect his mood with 6 different eye patterns, for example when he is hyper both eyes will be red!

Advanced voice recognition technology allows i-Cybie to recognise your voice; he will respond to eight commands, including 'Good Boy,' 'Bad dog,' 'Sit down,' 'Stay,' and 'Guard.' Clapping can also be used to command i-Cybie: special clap sequences are included with your i-Cybie, and when in trick mode there are 8 additional clap commands that tell i-Cybie to entertain you. Fully flexible and amazingly agile, i-Cybie can perform a number of different tricks and acrobatic movements. Marvel as he does a headstand, falls back into a crab position, dances, wags his tail, gives paw and scratches his ear. He's a real show-off!

A clever canine, i-Cybie has a series of intelligent sensors that allow him to react to sound, light, touch and his physical surroundings. Watch him rub his head into his owner's hands when his head is patted! He is able to stand up if he falls over, and to avoid walls, the edges of tables and other elevated surfaces. Watch him navigate a route, avoiding obstacles in his way!

To make i-Cybie perform, simply press the correct sensor command button. These are located on different parts of his body: his head button tells i-Cybie to listen, his orientation/balance sensor tells him he has fallen over, and his back button tells i-Cybie to sit or stay. And he is so smart that when his batteries are running low, he'll let owners know!

i-Cybie is available now from all good gadget shops and retailers, although please be aware that there is limited stock in the UK marketplace.

PaPeRo - NEC's Personal Robot

NEC's new personal robot, named 'PaPeRo' (aka 'Partner-type Personal Robot'), is unlike anything seen so far in robot development, with its natural expressions and ability to remember its owners' interests and preferences. In line with NEC's goal of finding solutions for the "i-society", PaPeRo was designed to provide a more natural interface with which people can easily and unconsciously benefit from the Internet.

PaPeRo can recognize 650 phrases and speak more than 3000, and with the latest image recognition technology PaPeRo has the ability to recognize people's faces. Through voice recognition technology, PaPeRo also makes the following possible:

Easy access to the Internet without use of a keyboard - notifying its owner of incoming messages and giving updated information.

Strengthening communication channels among family members with its ability to convey video messages.

Interaction with its owner in various ways, such as dancing, playing games, giving reminders, telling the time, and remotely operating TVs and other electrical appliances in the home.

Its ability to interact naturally with people opens up a variety of application possibilities for home automation systems, including support for implementing safety measures, support for the elderly, emergency communications systems, and home security. With the use of the Internet and software, the robot also makes possible a wide range of other applications, such as tutoring children and providing remote care for sick and disabled people.

PaPeRo has been developed using the latest technologies:

It has two cameras for eyes that provide a stream of visual data, analyzed in real-time, enabling it to recognize people and avoid bumping into objects such as furniture.
Its "ears" comprise four microphones. Three are used so the robot can detect voices, and the fourth is used to understand instructions given to it from a select vocabulary.
Movement is based on mechatronics specially developed for the robot, and consists of a simplified control structure and modularized components.
Software technologies, such as graphical editors that enable easy programming of actions, dialogue and behaviors.
High-integration technology providing a stand-alone architecture.

Why build robots?

The increasing pace of information technology in computers and communications is proving overwhelming for some people. Despite the emergence of more and more appliances offering greater convenience and functionality, users must cope with increasingly complex operating instructions. From children to the elderly, there is a growing call for technology that is simpler to use.

Thanks to the advances made in semiconductor and mechatronics technologies, it has now become feasible to develop home robots. This, together with the growing focus on and demand for technology developed for the individual rather than for the "future", has meant that attention given to the development of robots has grown significantly.

NEC believes robots that live with, and have the ability to interact with humans in many different ways will open up many possibilities. NEC plans to conduct further research and development using PaPeRo in various environments and locations, not only to advance the current technology, but also to improve interaction between humans and robots.

"The aim of our research at NEC is not just to further robot technology, but to examine and develop a better human-machine interface through the concept of 'living with robots'," said Yoshihiro Fujita, Project Manager, NEC Incubation Center.

NOMAD - The Thinking Robot

It is not science fiction. Researchers at The Neurosciences Institute in La Jolla have designed a machine that thinks.

The machine's brain is called Darwin, after the 19th century biologist who conceived the theory of natural selection. Under Institute director and Nobel laureate Gerald Edelman, M.D., Ph.D., the Darwin series of thinking brains began in the mid-1980s. Today, Darwin 6 consists of a realistically designed simulation of a nervous system housed in a mobile platform called NOMAD (Neurally Organized Mobile Adaptive Device).

The research is conducted in the Institute's W.M. Keck Foundation Laboratory of Machine Psychology. Established in 1998 with a grant of $1.5 million from the W.M. Keck Foundation of Los Angeles, the Keck Laboratory studies the neural bases of behavior and how the brain reacts and adapts to a changing world. Its objective is to develop a new generation of powerful models of brain activity. Unlike a robot, NOMAD is an autonomous "being," used as a tool to study how the brain controls behavior. According to neuroscientist Jeffrey Krichmar, Ph.D., NOMAD is at the behavioral level of an infant.

"NOMAD starts naive and learns from experience. It has a preference for light and a specific taste, but no other experience or programming," Krichmar explained.

NOMAD's behavior is controlled by the activity of its simulated brain cells, allowing researchers a unique window into how the human brain works and how brain mechanisms produce the range of behaviors associated with higher brain functions. NOMAD can interact with its environment by sensing light and taste and by moving around and grabbing play blocks with striped or spotted patterns.

"Since NOMAD is attracted to light, it will steer toward a block and pick it up. When it grabs the striped block, it gets an electrical charge," explained chief engineer James Snook.

"In the simulated brain, this conductivity registers as good taste. Blocks with spots give no charge, hence, bad taste. As NOMAD's gripper holds the block, the brain associates the taste with the pattern it sees. After learning, it will stop picking up bad tasting blocks. It will approach them and after seeing the pattern, will remember that they taste bad and move away."
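
The conditioning loop Snook describes (pick up, taste, associate, avoid) can be caricatured in a few lines of Python. This is only an illustrative sketch of the behaviour, not the Darwin 6 simulated nervous system, which models neural activity rather than a lookup table; the learning rate and trial count are arbitrary choices.

```python
import random

TASTE = {"striped": 1.0, "spotted": -1.0}  # charge = good taste, none = bad

def run_trials(n_trials, learning_rate=0.5, seed=0):
    """Let the robot encounter random blocks and learn which taste bad."""
    rng = random.Random(seed)
    value = {"striped": 0.0, "spotted": 0.0}  # learned taste prediction
    picked_up = []
    for _ in range(n_trials):
        pattern = rng.choice(["striped", "spotted"])
        if value[pattern] < 0:      # remembered bad taste: look, then move away
            picked_up.append(False)
            continue
        picked_up.append(True)      # naive or positive: grip the block
        # taste feedback nudges the prediction toward the experienced value
        value[pattern] += learning_rate * (TASTE[pattern] - value[pattern])
    return value, picked_up

value, picked_up = run_trials(40)
```

After a handful of trials the prediction for spotted blocks turns negative, so later spotted blocks are approached but no longer picked up, matching the behaviour described in the quote.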

"We are adding a third sense to NOMAD's repertoire; an auditory system," said Krichmar. The simulated auditory system has areas to categorize and locate a sound, he added. A tone is associated with the taste of the block (high-pitched from a striped block, low-pitched from a spotted block). When the block detects NOMAD's presence, it starts to beep.

Future plans are to give NOMAD a long-term memory that will enable it to remember objects and events and put them into context.

"Our main objective is to use NOMAD to test theories of the brain," Krichmar explained. "By analyzing its brain we hope to better understand how the human brain works. With this brain we can also model neurological diseases."

The implications of this research may include the development of better diagnostic tools for patients with neurological diseases, and improved methods for learning.

"Perhaps most exciting," Snook added, "will be the development of new pattern-recognition devices, based on the brain, that will communicate with digital computers."

Other discoveries at The Neurosciences Institute include demonstrating that fruit flies sleep, which could offer clues into sleep disorders. The Institute also showed that instinctive behavior can be transferred from one species of animal to another by transplanting early brain regions from quail to chickens.

Founded in 1981, The Neurosciences Institute is an independent, non-profit scientific research organization that studies the biological bases of higher brain functions, such as consciousness and memory. It is supported entirely by private donations.

Cye Personal Robot

From Probotics, Inc. comes a new smart and affordable personal robot. Called Cye, the compact personal robot can do a wide variety of tasks such as carry dishes, deliver mail, lead guests to a conference room and vacuum the carpet. Using wireless communications technology, Cye is controlled by Map-N-Zap, a highly intuitive graphical user interface (GUI) loaded on any PC with a 133 MHz chip or higher.

"Cye is a practical and inexpensive robot that's easy and fun to work with," said Henry Thorne, CEO of Probotics and robot guru. "Anyone who enjoys remote control devices will be thrilled with Cye's unbelievably nimble and intelligent navigation. PC lovers will find Cye an incredibly exciting peripheral that they can operate using basic point-and-click skills."

He added, "The Cye personal robot is ideal for robot hobbyists, who will think up all kinds of uses for Cye and exploit its design that allows them to easily add hardware and create new features of their own. Because of Cye's open software architecture, developers and programmers will be able to write their own cyeware."

With its PC based interface, users can Drag-N-Drive Cye on the screen and map out its environment. Cye can easily move around a home or office and learn how to navigate a new room in minutes. At a speed of three feet per second, Cye quickly moves around any room, pulling the optional wagon or vacuum attachment. Unlike other robots, Cye can follow scheduling instructions. By point-and-clicking on menu options written in plain English, users can schedule when and where they'd like Cye to go, and then Cye will automatically move there. For example, users can schedule Cye to go to the dining room at 7:00 p.m. and carry the dishes to the kitchen; vacuum the office at 10:35 a.m.; and distribute mail at 11:00 a.m.

Affordably priced at $695, Cye is today shipping from Probotics in limited quantities. Cye accessories include a wagon attachment ($89) and a vacuum attachment ($89) that allows users to attach any upright vacuum cleaner onto Cye. Cye is available in yellow, orange and black by visiting the Probotics web site.

Cye can be set up and working in just 15 minutes. To get started, users just plug the home base in, drop the robot on it, plug the radio pod in, connect it to their PC, and load the Map-N-Zap software from the CD. Cye can then be driven around by dragging its icon on the PC monitor. A compact personal robot, it measures 16" x 10" x 5" and weighs nine pounds. Cye communicates to and from the PC 10 times per second via a FCC approved 900 MHz radio link. The Cye Robot Package includes a fully assembled and tested Cye robot, base station, cables, AC adapter, Map-N-Zap software, operations manual, email tech support and one year manufacturer's warranty.

Cye comes with several tutorials including:
CyeServe: Have Cye bring food and drink to your friends
CyeTruck: Have Cye carry your dishes back to the kitchen
CyePup: Have Cye wait for you, wag its tail, go to sleep
CyePost: Have Cye deliver mail around the office
CyeGuide: Have Cye meet visitors and lead them to your office

Also available is a sound-responsive Cye model, the Cye-sr. Cye-sr is programmed to respond to claps. After one clap, Cye-sr beeps to let you know you've got its attention. To send Cye-sr to a destination, clap to indicate where you'd like it to go. For example, use two claps to send it to the kitchen, three claps to send it to the den and one clap to come back to its home base (charger). Cye-sr can also be navigated using the mouse on the PC, which is linked to the robot via wireless communications.
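
The clap scheme above is, in effect, a small dispatch table from clap counts to destinations. The following toy sketch is purely hypothetical (the Cye-sr firmware is proprietary); the destination names and the acknowledgement beep are taken from the description above.

```python
class ClapController:
    """Toy model of Cye-sr's clap interface: beep to acknowledge,
    then map the clap count to a destination (1 clap = home base)."""

    DESTINATIONS = {1: "home base", 2: "kitchen", 3: "den"}

    def __init__(self):
        self.log = []

    def on_claps(self, count):
        self.log.append("beep")  # Cye-sr beeps to show it has your attention
        destination = self.DESTINATIONS.get(count)
        if destination is not None:
            self.log.append(f"driving to {destination}")
        return destination
```

An unrecognised clap count simply returns no destination, so the robot stays put rather than guessing.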

iRobot-LE

From iRobot Corporation comes the first multi-purpose domestic robot that can be controlled through a web browser from anywhere in the world. This revolutionary product, called the iRobot-LE, gives its owners on-demand remote eyes and ears into their homes. With this roving telepresence capability, owners can drive around their home making sure it is secure, say goodnight to kids when away on business, check up on pets, and visit with elderly or house-bound relatives and friends.

"The iRobot-LE lets busy professionals be in two places at the same time," says Helen Greiner, President and co-founder of iRobot. "You control your iRobot-LE over the Internet and see video and hear audio from the iRobot-LE on your computer. You can wander your home in Boston while sitting behind your laptop in San Francisco. You can visit your kids and check up on their nanny - seeing what's going on in your house in realtime provides peace of mind."

The iRobot-LE goes anywhere a person can comfortably walk. Using Surefoot Stair climbing Technology, the iRobot-LE can go up and down stairs unassisted. The robot avoids objects using advanced sensor and signal processing technology - if you tell it to run into a wall, it's smart enough not to. As the iRobot-LE explores its home, its sophisticated artificial intelligence allows it to learn the layout and build a floor plan that it uses to navigate. A person controlling the iRobot-LE through a web browser can participate in conversations over the Internet. The robot's camera can be turned to address people and to look at people as they talk. If there's something more interesting happening in another direction, the iRobot-LE can be instructed to drive over there.

The iRobot-LE is powered by iRobot Aware Robot Control Software running on a Pentium II class processor with an Apache-SSL web server and Linux Operating System. Priced about the same as a high-end notebook computer, the iRobot-LE is a fully functional mobile computer.

"The iRobot-LE takes the Internet beyond the Web to the next level - no longer is the Internet just for the exchange of information. Now it lets you really travel around the world, and drop in on friends and business acquaintances, check in at home, or visit an exotic locale, from any Web browser anywhere in the world," says Prof. Rodney Brooks, Director of the Artificial Intelligence Lab at MIT, Chairman, and co-founder of iRobot. "This remote presence technology goes beyond chat rooms, e-mail, and way beyond teleconferencing to bring people together. iRobot technology will change the way we perceive time and distance, making the world a truly global village."

Robomow RL500

Think about it: the average British gardener spends sixty hours per year cutting the grass. And operating a petrol-powered lawnmower for one hour creates the same amount of smog-forming emissions as 40 new cars running for the same period of time!

Now, from Friendly Robotics comes the solution; the Robomow RL500.

Robomow RL500 is a fully automatic robotic lawnmower. It is powered by rechargeable batteries, it is quiet, there are no smelly fumes and it will cut an area the size of a tennis court - approximately 250 square meters - before it needs a recharge.

Place Robomow RL500 anywhere on the lawn and press the GO button. It works out for itself where it is and quietly gets on with cutting the entire lawn without any further assistance from its owner. Robomow RL500 operates for about two hours on one charge, leaving a perfectly mown lawn with no clippings to dispose of. It is a mulching mower: it cuts the grass into tiny pieces and drives them back into the lawn, where natural chemical breakdown recycles the nutrients and moisture and actually feeds the lawn. Robomow RL500 turns itself off when it finishes work, and can be driven back to its shed for recharging using a special integral manual controller. This easy-to-use controller also allows you to mow manually where preferred. Robomow RL500 takes 24 hours to recharge, so it can be used every day if necessary. Thanks to its unique guidance and perimeter recognition system, RoboScan, it can work unattended - even at night if required.

Installation is a simple one-off operation. Robomow RL500 comes in a box with everything you need to self-install, including an explanatory video and an operator's manual. You simply peg a thin wire around the edge of your lawn and connect it to a small, cup-sized switch. When turned on, the switch sends a small electrical signal around the garden and defines the area within which the Robomow operates.
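Friendly Robotics has not published how RoboScan senses the wire; presumably the mower detects the field radiated by the energised loop. In software terms, though, the question the mower keeps answering - "am I still inside the pegged-out boundary?" - is the classic point-in-polygon test, sketched below purely as an analogy (the lawn coordinates are hypothetical).

```python
def inside_perimeter(point, loop):
    """Ray-casting point-in-polygon test: count how many times a
    horizontal ray from `point` crosses the boundary wire `loop`
    (a list of (x, y) corner pegs). An odd count means inside."""
    x, y = point
    crossings = 0
    n = len(loop)
    for i in range(n):
        (x1, y1), (x2, y2) = loop[i], loop[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # this wire segment straddles the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                crossings += 1
    return crossings % 2 == 1

lawn = [(0, 0), (10, 0), (10, 8), (0, 8)]  # hypothetical pegged-out lawn (m)
inside_perimeter((5, 4), lawn)   # True: keep mowing
inside_perimeter((12, 4), lawn)  # False: turn back toward the lawn
```

The nice property of a closed loop, whether tested geometrically or sensed as a signal, is that it works for any lawn shape: concave corners, L-shaped gardens and islands pegged out around flower beds all fall out of the same boundary test.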

ROBOMOW RL500 TECHNICAL SPECIFICATION

Dimensions: 89x66.5x31.5cm (35" x 26" x 12.4")
Weight: 19kg (42lb) + 13kg (29lb) battery pack
Cutting Width: 53cm (21") using 3 blades
Typical Lawn Size: Cuts 250 m2 on a single charge
Battery Pack: 2 x 12V Sealed lead acid gel batteries, maintenance free.
Charging Time: 24 Hours
Operating Time: Approximately 2 hours.
Noise Level: Below 82 dB (A)
Mowing Positions: 6 positions at the front and 3 positions at the rear.
Blade Speed: 5800 RPM

Dyson DC06

Possibly the most intelligent domestic appliance ever made is now being assembled at the Dyson research and development centre and is on home trial. To date, over 1,100 people interested in buying DC06 have contacted Dyson. The Dual Cyclone robotic vacuum cleaner delivers the same high level of pick-up efficiency demanded of all Dyson Dual Cyclone vacuum cleaners. All you need to do is press 'on', choose a speed, press 'go' and your room will be vacuumed.

DC06 has over 50 sensory devices which constantly feed data into the 'brain' of the machine: three onboard computers. Using this data it makes 16 decisions per second, constantly adjusting to navigate its way around a room.

DC06 does not need to be programmed. It 'thinks' for itself and can therefore clean a room on its own. Its intelligence stops it falling down stairs and pauses the machine if a dog or child gets too close. DC06 can even tell you how it's feeling: its mood light is blue when it is happy, green when it is moving around an obstacle and red when it feels in danger.
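Dyson has not disclosed how the three onboard computers arbitrate between all those sensors, but a common pattern for this kind of behaviour selection is a fixed priority list checked on every cycle of the decision loop - here, the 16-per-second cycle described above. The sketch below pairs each behaviour with the mood-light colour mentioned in the text; the sensor names and rules are illustrative, not Dyson's.

```python
def decide(sensors):
    """One decision cycle: the highest-priority hazard wins.
    `sensors` is a dict of boolean readings (hypothetical names;
    DC06's actual sensor set and arbitration logic are not public)."""
    if sensors.get("cliff_ahead"):          # e.g. approaching the stairs
        return ("reverse", "red")           # red = feels in danger
    if sensors.get("moving_object_near"):   # dog or child too close
        return ("pause", "red")
    if sensors.get("obstacle_ahead"):       # furniture leg, wall
        return ("steer_around", "green")    # green = avoiding an obstacle
    return ("vacuum_forward", "blue")       # blue = happy, cleaning

decide({"cliff_ahead": True})     # -> ("reverse", "red")
decide({"obstacle_ahead": True})  # -> ("steer_around", "green")
decide({})                        # -> ("vacuum_forward", "blue")
```

Running a prioritised check like this at a fixed rate is what lets a robot react to a cliff edge within a fraction of a second while still spending most cycles simply driving and cleaning.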

DC06 has been developed over the last two and a half years to solve the two major problems of automatic vacuum cleaning: how to make a small cleaner pick up dust as well as a big one, and how to make a machine intelligent enough to cover a room thoroughly.

Technological advancements such as DC06 could mean that in the future no human has to do their own housework.

A new-technology electric motor has been specially designed for DC06. It has no carbon brushes, which removes the need for a second filter, and it should last four times as long as a conventional motor.

"An automatic vacuum cleaner must clean as well as the best mains powered vacuum cleaner and with the sort of methodical coverage a human being couldn't possibly achieve." James Dyson.

DC06 comes in silver and yellow, costs around £2,500 and weighs only 9.2kg. It also comes with an attachable hose for stairs and upholstery, which means it is the only vacuum cleaner you will need. Like all Dyson vacuum cleaners, DC06 has no bag and therefore no clogging. High-performing Dyson Dual Cyclone™ technology also alleviates the other problems associated with bags: tearing, bursting and puffing dust.

Palm Pilot Robot

Researchers at Carnegie Mellon University's Robotics Institute, working in the toy and entertainment area, have developed an easy-to-build, autonomous robot controlled by a Palm handheld computer. The system, originally built with off-the-shelf components, has been commercialized by Acroname, Inc., of Boulder, Colo., which is selling it in a kit.

"The Palm Pilot Robot was created to enable just about anyone to start building and programming mobile robots at a modest cost," said Illah Nourbakhsh, assistant professor of robotics and head of the institute's Toy Robots Initiative. "The Palm makes a handy robot controller. It packs a lot of computational power in a small size, runs on batteries, and best of all, can display graphics and an interactive user interface."

Robotic elements built into the base on which the Palm sits empower it to move about on flat surfaces and sense its nearby surroundings. The base is equipped with three "omni-wheels" with independent control of rotation that allow movement in any direction. The base also incorporates three optical range sensors, enabling the palm robot to "see" the world up to about a meter away and sense nearby obstacles and walls.
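Three omni-wheels spaced 120° apart are what make "movement in any direction" possible: any combination of translation and rotation maps to a unique speed for each wheel. Acroname's actual drive code isn't reproduced here; the sketch below shows the standard kinematic formula, assuming each wheel's drive direction is tangent to a circle of radius R around the robot's centre.

```python
import math

def wheel_speeds(vx, vy, omega, R=0.05, wheel_angles=(0, 120, 240)):
    """Map a desired body velocity (vx, vy in m/s, omega in rad/s)
    to the rim speed of each omni-wheel. Wheel i sits at angle
    theta_i from the robot's x-axis and drives tangentially.
    R (wheel distance from centre, in metres) is an assumed value."""
    speeds = []
    for deg in wheel_angles:
        theta = math.radians(deg)
        # tangential component of the body motion at this wheel,
        # plus the contribution from spinning about the centre
        v = -math.sin(theta) * vx + math.cos(theta) * vy + R * omega
        speeds.append(v)
    return speeds

wheel_speeds(0.0, 0.0, 2.0)  # spin in place: all three wheels run equally
wheel_speeds(0.2, 0.0, 0.0)  # pure sideways translation, no rotation
```

Two properties make this drive attractive for a small robot: spinning in place just means running all wheels at the same speed, and for pure translation the three wheel speeds always sum to zero, so no net rotation is induced.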

Complete construction plans and software for the Palm Pilot Robot are documented on its Web site and can be downloaded and installed directly on the Palm. The source code is also available and can be modified, compiled and installed on the Palm as well. In addition, there are libraries that greatly simplify the programming of the robot.

Nourbakhsh collaborated on the development of the Palm Pilot Robot with Computer Science Professor Matthew T. Mason and his laboratory assistant, Grigoriy Reshko, a freshman in Carnegie Mellon's School of Computer Science. The project grew out of earlier work in Mason's Manipulation Laboratory where he and Reshko were developing easy and inexpensive rapid prototyping of small robots using simple construction techniques and plastic gear motors.

Mason had envisioned a small tabletop robot that could tidy up a desk. Nourbakhsh was thinking of something students could use in a high school setting, and the 17-year-old Reshko had the mindset and technical expertise to combine their visions. When Reshko had ironed the bugs out of the robot a couple of weeks ago, he released a pilot version at his Web site, which has since received more than 150,000 hits.

Acroname is a six-year-old company whose goal is to make robotics easier by providing parts and descriptions for better robots. Descriptions of their products and information on robotics can be found at their Web site.

Honda ASIMO Robot

From Honda Motor Co. comes a new small, lightweight humanoid robot named ASIMO that is able to walk in a manner closely resembling that of a human being.

One area of Honda's basic research has involved the pursuit of developing an autonomous walking robot that can be helpful to humans as well as be of practical use in society. Research and development on this project began in 1986. In 1996 the prototype P2 made its debut, followed by P3 in 1997.

"ASIMO" is a further evolved version of P3 in an endearing people-friendly size which enables it to actually perform tasks within the realm of a human living environment. It also walks in a smooth fashion which closely resembles that of a human being. The range of movement of its arms has been significantly increased and it can now be operated by a new portable controller for improved ease of operation.

ASIMO Special Features:
Smaller and Lightweight
More Advanced Walking Technology
Simple Operation
Expanded Range of Arm Movement
People-Friendly Design

Small & Lightweight Compared to P3, ASIMO's height was reduced from 160cm to 120cm and its weight from 130kg to a mere 43kg. A height of 120cm was chosen because it was considered optimum for operating household switches and reaching doorknobs in a human living space, and for performing tasks at tables and benches. The reduction was achieved by redesigning ASIMO's skeletal frame, reducing the frame's wall thickness and specially designing the control unit for compactness and light weight.

Advanced Walking Technology Predicted Movement Control (for predicting the next move and shifting the center of gravity accordingly) was combined with existing walking control know-how to create i-WALK (intelligent real-time flexible walking) technology, permitting smooth changes of direction. Additionally, because ASIMO walks like a human, with instant response to sudden movements, its walking is natural and very stable.
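Honda has published that its biped walking control is based on the zero moment point (ZMP): the point on the ground where the net tipping moment vanishes. Predicting where the ZMP will land is what lets a controller shift the centre of gravity in advance, as the paragraph above describes. The sketch below uses the standard linear-inverted-pendulum simplification with illustrative numbers; it is not Honda's proprietary i-WALK code.

```python
G = 9.81  # gravitational acceleration, m/s^2

def zmp_x(com_x, com_z, com_ddx):
    """Zero-moment-point x-coordinate for a linear inverted pendulum:
    x_zmp = x_com - (z_com / g) * x''_com.
    com_x: horizontal centre-of-mass position (m),
    com_z: centre-of-mass height (m),
    com_ddx: horizontal centre-of-mass acceleration (m/s^2)."""
    return com_x - (com_z / G) * com_ddx

def stable(zmp, foot_min, foot_max):
    """The stance is balanced while the ZMP stays inside the support foot."""
    return foot_min <= zmp <= foot_max

# Standing still: the ZMP sits directly under the centre of mass.
zmp_x(0.0, 0.6, 0.0)       # 0.0
# Decelerating hard throws the ZMP forward of the centre of mass:
z = zmp_x(0.0, 0.6, -3.0)  # about 0.18 m
stable(z, -0.05, 0.10)     # False: the controller must react
```

The "Predicted Movement Control" idea corresponds to evaluating this kind of formula for planned future motion: if the predicted ZMP would leave the support foot, the centre of gravity is shifted before the step is taken rather than after balance is already lost.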

Simple Operation To improve the operation of the robot, flexible walking control and button operation (for gesticulations and hand waving) can be carried out by either a workstation or from the handy portable controller.

Expanded Range of Movement By installing ASIMO's shoulders 20 degrees higher than P3's, elbow height was increased to 15 degrees over horizontal, allowing a wider range of work capability. Also, ASIMO's range of vertical arm movement has been increased to 105 degrees, compared to P3's 90-degree range.

People-Friendly Design In addition to its compact size, ASIMO features a people-friendly design that is attractive in appearance and easy to live with.

About the Name
ASIMO is an abbreviation for "Advanced Step in Innovative Mobility"; revolutionary mobility progressing into a new era.

Specifications
Weight: 43kg
Height: 1,200mm
Depth: 440mm
Width: 450mm
Walking Speed: 0 - 1.6km/h
Operating Degrees of Freedom*
Head: 2 degrees of freedom
Arm: 5 x 2 = 10 degrees of freedom
Hand: 1 x 2 = 2 degrees of freedom
Leg: 6 x 2 = 12 degrees of freedom
TOTAL: 26 degrees of freedom
Actuators: Servomotor + harmonic speed reducer + drive ECU
Controller: Walking/Operation Control ECU, Wireless Transmission ECU
Sensors - Foot: 6-axis sensor
Torso: Gyroscope & Acceleration Sensor
Power Source: 38.4V/10AH (Ni-MH)
Operation: Work Station & Portable Controller

*Degrees of Freedom: a joint has one degree of freedom for each independent range of movement: forward/backward, up/down and rotation.
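The per-joint counts in the specification above can be checked against the 26-degree total with a line of arithmetic:

```python
# Sanity check of ASIMO's degree-of-freedom budget from the spec table.
dof = {
    "head": 2,
    "arms": 5 * 2,   # 5 per arm, two arms
    "hands": 1 * 2,  # 1 per hand, two hands
    "legs": 6 * 2,   # 6 per leg, two legs
}
total = sum(dof.values())
total  # 26, matching the TOTAL line in the spec
```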

Cyber K'NEX

Cyber K'NEX is a new range of robots, dogs and racing cars which react to their surroundings using light, motion, infra-red and sound sensors, making independent decisions about their next actions! Each model is programmed using a Cyber Key, which gives its own distinctive character and brings it to life. Their movements can all be triggered by standard TV or stereo remote control units. At the top of the range, the Ultra set has its own programmable controller that allows children to trigger the models to act out different responses.

For example, Woof, the Cyber K'NEX dog, can detect when an intruder walks into a room and will ward them off by growling and snarling, while Mectron, the Cyber K'NEX robot will respond by speaking and flashing his lights or firing missiles from the rocket launcher in his chest.

Each Cyber K'NEX set can build at least three models, all with their own character determined by the Cyber Key which is simply plugged into the back of the model. Cyber Keys can be interchanged with each model that is made from the set and with the Ultra set, additional "personalities" may be downloaded from the K'NEX website.

Sets include the "Super Racers" - a range of futuristic vehicles guaranteed to tackle everything in their path; the "Cybots" - 21st-century robots; and the "Ultra" set, which builds a range of models including Woof the Cyber K'NEX dog.

Sony 2nd Generation AIBO

Following on from the first ever autonomous entertainment robot, the AIBO ERS-110, Sony now introduces the 2nd-generation "AIBO" ERS-210, which has a greater ability to express emotion for more intimate communication with people. It is available now with no restriction on the number of units produced or the time period for orders: all customers ordering an "AIBO" ERS-210 will be able to purchase a unit.

The new AIBO has additional movement in both ears and an increased number of LEDs (face x 4, tail x 2) and touch sensors (head, chin, back), which means that it can show an abundant array of emotions such as "joy" and "anger". In order to increase interaction with people, the ERS-210 series' most distinctive feature - its autonomous robot technology (reacting to external stimuli and making its own judgements), which allows AIBO to learn and mature - has been enhanced. It now includes features frequently requested by AIBO owners such as a Name Recording Function (recognizes its own name), Voice Recognition (recognizes simple words) and Photo Taking.

The technologies that allow the ERS-210 to communicate, such as the autonomous feature which gives AIBO the ability to learn and mature, plus the voice recognition technologies, will be available on a special flash-memory AIBO Memory Stick software application (autonomous AIBO-ware) called "AIBO Life" [ERF-210AW01] (*sold separately).

So that people can enjoy using AIBO in a variety of new ways, two additional software applications (AIBO-ware) are also being introduced: the "Hello AIBO! Type A" [ERF-210AW02] demonstration software and the "Party Mascot" [ERF-210AW03] game software (*both sold separately). A new line-up of AIBO accessories, such as a carrying case and software that enables owners to perform simple edits to AIBO's movements and tonal sounds on a home PC, will also be offered to personalize the way owners interact with their AIBO.

Main Features of "AIBO" ERS-210

Three Different Color Variations

The [ERS-210] is available in three colour variations (silver, gold and black) so customers can choose the one that suits them best.

Autonomous Robot AIBO - actions based on own judgement

When used with the Memory Stick application (AIBO-ware) "AIBO Life" [ERF-210AW01] (*sold separately), AIBO acts as a fully autonomous robot and can make independent decisions about its own actions and behavior. AIBO grows up with abundant individuality by interacting with its environment and communicating with people, responding to its own instincts such as "the desire to play with people" and "the desire to look for the objects it loves".

Enhanced Features to Express Emotions

When used in conjunction with "AIBO Life" (*sold separately) AIBO [ERS-210] owners can enjoy the following features to their full capacity:

Touch Sensors on the head, chin and back
In addition to the sensor on the head, new touch sensors have been added to the back and under the chin for more intimate interaction with people.
20 Degrees of Freedom
A greater variety of expressions due to an increase in the degrees of freedom of movement from 18 on the [ERS-110] and [ERS-111] (mouth x 1, head x 3, tail x 2, leg x 3 per leg) to 20 on the AIBO [ERS-210], with new movement added to the ears.
LED on the Tail
In addition to LED (light-emitting diodes) on the face, LED have been added to the tail. A total of 4 LED on the face (expressing emotions such as "joy" "anger") plus 2 on the tail (expressing emotions like "anxiety" "agitation") allows AIBO to express a greater variety of emotions.

Enhanced Communication Ability with New Advanced Features

When used in conjunction with "AIBO Life" (*sold separately) the AIBO [ERS-210] has the following features:

Personalized Name (name recording & recognition)
Owners can record a personal name for AIBO, and it will respond to this name with actions and by emitting a special electronic sound.
Word Recognition (voice recognition function)
This depends on AIBO's level of development and maturity: the number of words and numbers AIBO can recognize changes as it grows up, until it can recognize about 50 simple words. In response to the words it recognizes, AIBO performs a variety of actions and emits electronic sounds.
Response to Intonation of Speech (synthetic AIBO language)
When spoken to, AIBO can imitate the intonation (musical scale) of the words it hears by using its own "AIBO language" (tonal electronic language).

Photo Taking Function
If used in conjunction with "AIBO Life" and "AIBO Fun Pack" software applications (*both sold separately) AIBO will take a photograph of what it can see using a special colour camera when it hears someone say "Take a photo". Using "AIBO Fun Pack" software [ERF-PC01] photographs taken by AIBO can be seen on a home PC screen.

Wireless LAN Card
By purchasing a separate IEEE802.11b Wireless LAN card [ERA-201D1], inserting it into the PC card slot and using "AIBO Master Studio" software (*sold separately), the movements and sounds AIBO makes can be created on a home PC and sent wirelessly to control AIBO.
Other Features
Open-R v1.1 architecture
Uses Sony's original real-time Operating System "Aperios".
The head and legs can be removed from the body and changed.