Washington, June 13 : The use of rescue robots may prove to be a great technological achievement in saving lives, as these robots can search through the rubble of a collapsed building or survey a chemical spill area. However, emergency planners working away from the immediate disaster site face difficulty in co-ordinating with these robots because the images they send back from the site are often hard to interpret.
Ideally, the robots should be capable of beaming back clear, easily interpretable images. To help make that possible, scientists have come up with a new ASTM International standard, developed with first responders and manufacturers under a program coordinated by the National Institute of Standards and Technology (NIST).
The standard offers a systematic way to evaluate the visual capability of the robots, which is needed to drive the device at a disaster site, search for victims and assess general hazard conditions.
The test data would enable emergency personnel to select the best systems for their specific needs. In fact, if adopted by industry, the standard may accelerate the innovation, development and deployment of these life-saving robots.
In reality, the working of these life-saving robots is quite different from what we see in science fiction movies, where robot imagery is readily interpretable by remote operators.
In practice, real-time color video images from urban search-and-rescue robots reflect the type of sensors or camera lens used. A zoom lens, for instance, can be like looking through a soda straw, yet it can be useful for focusing on certain important objects. Conversely, images from a lens offering a wide field of view, such as 120 to 150 degrees, offer little depth perception and are of little use for navigating in tight quarters, but in the case of aerial robots and ground vehicles they can provide useful survey data.
Both far-vision acuity and near-vision acuity can be important for surveys of HAZMAT disaster sites, with far-vision cameras providing the overall picture and near-vision acuity playing a critical role in reading chemical labels. (Near-vision acuity is also critical for small robots that must operate in confined spaces.) Finally, the amount of available light can affect monitor images.
The new standard's test methods measure the camera's field of view; the system's visual acuity at far distances, with both ambient lighting and lighting onboard the robot; visual acuity at near distances, again in both light and dark environments; and, if a zoom lens is provided, visual acuity with zoom in both light and dark environments.
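As a rough illustration only, the test conditions described above could be tabulated in a simple record structure like the following Python sketch. This is not the actual ASTM test procedure; all names, fields and scores here are hypothetical assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class AcuityResult:
    """One visual-acuity measurement for a camera system (hypothetical record)."""
    distance: str        # "near" or "far"
    lighting: str        # "ambient" or "onboard" (dark site, robot's own lights)
    zoom_used: bool      # whether a zoom lens, if provided, was engaged
    acuity_score: float  # higher = finer detail resolved (illustrative units)

@dataclass
class CameraReport:
    """Summary of one camera system's results across test conditions."""
    field_of_view_deg: float
    acuity_tests: list

    def best_acuity(self, distance: str):
        """Return the best score recorded at the given distance, or None."""
        scores = [t.acuity_score for t in self.acuity_tests
                  if t.distance == distance]
        return max(scores) if scores else None

# Example: a wide-angle survey camera measured under four conditions.
report = CameraReport(
    field_of_view_deg=140.0,
    acuity_tests=[
        AcuityResult("far", "ambient", False, 0.8),
        AcuityResult("far", "onboard", False, 0.5),
        AcuityResult("near", "ambient", False, 1.6),
        AcuityResult("near", "onboard", False, 1.2),
    ],
)
print(report.best_acuity("near"))  # → 1.6
```

A structure like this would let procurement officers compare candidate robots condition by condition, rather than relying on a single headline specification.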
The results are useful for writing procurement specifications and for acceptance testing of robots for urban search-and-rescue applications.