The Defense Advanced Research Projects Agency is inviting contract and cooperative agreement applications for innovative research on automated technologies for deep natural language understanding. The Deep Exploration and Filtering of Text (DEFT) program seeks to develop the ability to see through language to meaning in text, to make use of key information contained in text documents, to cue up information sources that contain new developments for analysts, and to automate the initial stages of report writing. DEFT aims to enable analysts to discover implicitly expressed, actionable information.
New technology being developed by Dr. Yorick Wilks and his team at the Institute for Human and Machine Cognition (IHMC) will soon automate reading, with an emphasis on national security. In the security world, much vital information surfaces in blog exchanges between “people of interest” — that is, people of security interest to the government.
A serious difficulty arises because the number of these blogs, as well as the volume of words and information they contain, is enormous. In fact, it’s far too large for any team of analysts to read and uncover the relevant information buried within. Dr. Wilks’s team is setting out to show that a computer can read, understand, and filter such blogs to identify those that would be of the greatest interest to human analysts.
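The article doesn't describe the team's actual methods, but the core task — ranking a large pool of documents by relevance to an analyst's interests — can be sketched with a simple TF-IDF score. Everything below (function names, tokenization, the scoring formula's use as a filter) is illustrative, a toy stand-in for the far deeper language understanding DEFT aims at:

```python
import math
from collections import Counter

def tfidf_rank(documents, query_terms):
    """Rank documents by summed TF-IDF weight of the query terms.

    TF (term frequency) rewards documents that mention a term often;
    IDF (inverse document frequency) discounts terms that appear
    everywhere. Here 'relevance' is just weighted keyword overlap.
    """
    n = len(documents)
    tokenized = [doc.lower().split() for doc in documents]
    # Document frequency: how many documents contain each query term.
    df = {t: sum(1 for toks in tokenized if t in toks) for t in query_terms}
    scores = []
    for i, toks in enumerate(tokenized):
        counts = Counter(toks)
        score = 0.0
        for t in query_terms:
            if df[t] == 0 or counts[t] == 0:
                continue
            tf = counts[t] / len(toks)
            idf = math.log(n / df[t])
            score += tf * idf
        scores.append((score, i))
    # Highest-scoring documents first; an analyst would read from the top.
    return [i for score, i in sorted(scores, reverse=True)]
```

A real system would replace keyword overlap with semantic analysis, but the shape of the pipeline — score every document, surface only the top of the ranking to a human — is the same.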
Carnegie Mellon, with DARPA support, is already at work on a synthetic reality project to develop programmable matter. As their website describes:
The goal of the claytronics project is to understand and develop the hardware and software necessary to create a material which can be programmed to form dynamic three dimensional shapes which can interact in the physical world and visually take on an arbitrary appearance. Claytronics refers to an ensemble of individual components, called catoms—for claytronic atoms—that can move in three dimensions (in relation to other catoms), adhere to other catoms to maintain a 3D shape, and compute state information (with possible assistance from other catoms in the ensemble). Each catom contains a CPU, an energy store, a network device, a video output device, one or more sensors, a means of locomotion, and a mechanism for adhering to other catoms.
The power and flexibility that will arise from being able to “program” the world around us should influence every aspect of the human experience. In our project we focus in on one particular aspect of the human experience, how we communicate and interact with each other. Claytronics is a technology which can serve as the means of implementing a new communication medium, which we call pario. The idea behind pario is to reproduce moving, physical 3D objects. Similar to audio and video, we are neither transporting the original phenomena nor recreating an exact replica: instead, the idea is to create a physical artifact that can do a good enough job of reproducing the shape, appearance, motion, etc., of the original object that our senses will accept it as being close enough.
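The hardware inventory in the quote suggests a natural software model: each catom carries local state, and a shape only holds together if the adhesion bonds keep the ensemble connected. A minimal, purely illustrative sketch (none of these class or field names come from the Claytronics project):

```python
from dataclasses import dataclass, field

@dataclass
class Catom:
    """Toy model of one claytronic atom: position relative to the
    ensemble, remaining energy store, and adhered neighbors."""
    position: tuple                               # (x, y, z)
    energy: float                                 # joules remaining
    neighbors: set = field(default_factory=set)   # ids of bonded catoms

class Ensemble:
    def __init__(self):
        self.catoms = {}

    def add(self, cid, catom):
        self.catoms[cid] = catom

    def adhere(self, a, b):
        # Adhesion is symmetric: both catoms record the bond.
        self.catoms[a].neighbors.add(b)
        self.catoms[b].neighbors.add(a)

    def is_connected(self):
        """True if every catom is reachable through adhesion bonds
        (breadth-first search) -- i.e., the 3D shape holds together."""
        if not self.catoms:
            return True
        seen, frontier = set(), [next(iter(self.catoms))]
        while frontier:
            cid = frontier.pop()
            if cid in seen:
                continue
            seen.add(cid)
            frontier.extend(self.catoms[cid].neighbors)
        return seen == set(self.catoms)
```

The connectivity check hints at why the quote stresses that catoms compute state "with possible assistance from other catoms": maintaining a shape is inherently a distributed problem.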
Raytheon and Northrop Grumman are developing new systems and concepts for close air support using an unmanned version of the twin-engine A-10 Thunderbolt II. The companies received contracts worth $7 million each in April 2011 under phase one of the U.S. Defense Advanced Research Projects Agency (DARPA) Persistent Close Air Support (PCAS) program.
For years, the US military has been hoping to develop “micro air vehicles” – ultra-small flying robots capable of performing surveillance in dangerous territory. Building these machines is not easy. The dynamics of flight change at very small sizes, and the vehicles need to be lightweight enough to fly, yet strong enough to carry cameras and other equipment. Most formidably, they need a source of power, and batteries light enough for microfliers just don’t have enough juice to keep the craft aloft for very long. Consider the tiny, completely synthetic drones that engineers have managed to create: the DelFly Micro, which measures less than 10 cm from wingtip to wingtip, can stay airborne for just three minutes.
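The battery problem is easy to see with a back-of-the-envelope energy budget: endurance is just stored energy divided by average power draw. The numbers below are illustrative, not the DelFly Micro's actual specifications:

```python
def flight_time_minutes(battery_mah, voltage_v, draw_w):
    """Ideal endurance from battery capacity and average power draw.

    Energy (Wh) = capacity (Ah) * voltage (V); time = energy / power.
    """
    energy_wh = (battery_mah / 1000.0) * voltage_v
    return energy_wh / draw_w * 60.0

# An illustrative micro-flier battery: 30 mAh at 3.7 V, drawing ~1.1 W
# in flapping flight, gives roughly six minutes on paper -- and real
# airframes do worse, since batteries sag under load and every extra
# milliamp-hour of capacity adds mass the wings must then lift.
```

Scaling up the battery doesn't escape the trap: more capacity means more weight, which demands more power, which drains the bigger battery faster.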
The government agency released a video yesterday that highlights one of LS3’s most powerful skills: the ability to follow a leader by using computer vision and GPS. In the four-minute clip, you can watch the dog-like robot following an instructor over some rough terrain — with great ease — in a wooded area near Fort Pickett, Va.
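Follow-the-leader behavior can be sketched, in grossly simplified form, as a controller that steers toward the leader's position but holds a standoff distance so the robot never crowds the person it is tracking. This is a toy 2D stand-in for LS3's vision- and GPS-guided following; all parameters are illustrative:

```python
import math

def follow_step(follower, leader, standoff=5.0, step=1.0):
    """One update of a naive leader-following controller.

    Moves the follower up to `step` meters straight toward the
    leader's (x, y) position, stopping at the `standoff` radius.
    """
    dx = leader[0] - follower[0]
    dy = leader[1] - follower[1]
    dist = math.hypot(dx, dy)
    if dist <= standoff:
        return follower  # close enough; hold position
    # Never overshoot the standoff boundary in a single step.
    scale = min(step, dist - standoff) / dist
    return (follower[0] + dx * scale, follower[1] + dy * scale)
```

The real robot must also plan around obstacles and keep its footing on rough terrain, which is where the hard robotics lives; the pursuit logic itself is the easy part.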
DARPA’s Robocheetah V2 smashed the robot land speed record this week, clocking 18 mph.
The robot has been designed as a part of research into the limitations of bomb-disposal bots.
DARPA is aiming to improve the mobility and flexibility of robots which, they say, would help them “more effectively assist warfighters across a great range of missions”.
The high-tech whizzes at DARPA, the military research arm of the Defense Department, displayed some breakthrough technology for space, ocean, robotics and ground war at a Congressional Tech Showcase in Washington earlier this month.
But the most inspiring tech was an innovation underway for America’s veterans who return home with upper limb loss. The Johns Hopkins Applied Physics Lab’s Modular Prosthetic Limb, part of DARPA’s Revolutionizing Prosthetics Program, is among the most sophisticated artificial arms ever made.
The artificial limb moves like the real thing and can handle most everyday tasks. Built over the course of five years, it lets its wearer play the piano, toss a ball, or pick up a cup of coffee and take a sip.