By Heather Kelly, CNN
For a tiny bug-shaped robot made of cardboard and plastic, Dash is surprisingly advanced.
The new $65 DIY programmable robot is built for tinkering. It comes with a gyroscope, visible light and infrared sensors, and an iOS app for controlling it over Bluetooth 4. The Arduino-compatible bots also have LED lights and additional ports for expanding and hacking the Dash.
You can program in your own behaviors, making the robots move in patterns or follow walls. They can operate as a swarm and cooperate, or set off on individual tasks. Add in touch sensors and turn two peace-loving Dash robots into battlebots that fight each other and keep score on their multi-colored LEDs.
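Behaviors like wall-following boil down to a simple sense-and-act loop. As a rough illustration (the sensor values, thresholds, and command names below are invented for this sketch; Dash's actual Arduino-compatible firmware API may differ), a wall-follower might look like:

```python
# Illustrative wall-following logic for a small robot like Dash.
# NOTE: IR readings, thresholds, and command names are hypothetical.

TOO_CLOSE = 800   # strong IR reflection: the wall is near
TOO_FAR = 300     # weak IR reflection: the wall is far

def wall_follow_step(ir_right: int) -> str:
    """Decide one motion command from a right-side IR reading."""
    if ir_right > TOO_CLOSE:
        return "veer_left"    # drifting into the wall -> steer away
    if ir_right < TOO_FAR:
        return "veer_right"   # losing the wall -> steer toward it
    return "forward"          # in the sweet spot -> keep going

# Example: a sequence of readings as the robot drifts toward a wall
commands = [wall_follow_step(r) for r in (500, 850, 600, 250)]
```

Running this decision function once per sensor tick is all a basic wall-following behavior needs; swarm behaviors layer coordination on top of the same loop.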
"Our goal is to get a robot into everyone's hands because we think they're great educational tools," said Nick Kohut, one of the founders of Dash Robotics.
Conveniently, Dash will fit right into the palms of those hands. It has six legs, weighs about half an ounce, and its killer feature is being able to move quickly over various kinds of terrain. It can cover five to six feet a second and is able to cross sand, concrete and other surfaces.
By Doug Gross, CNN
If you only watch one robot video today, make it this one. If you watch two? Send us the other one.
RHex is a creation of researchers at the University of Pennsylvania who hope it could one day climb rubble in emergency rescue situations or zoom across scorching desert sands with its six whirling, springy legs.
"What we want is a robot that can go anywhere, even over terrain that might be broken and uneven," said graduate student Aaron Johnson, one of those researchers. "These latest jumps greatly expand the range of what this machine is capable of, as it can now jump onto or across obstacles that are bigger than it is."
RHex (short for "robot hexapod" and pronounced "Rex") is actually more than a decade old, the brainchild of a multiuniversity project. But Penn researchers recently created a new version - called X-RHex Lite - that, as its name suggests, is lighter and more agile than previous versions.
The result: a moving rectangle that has, in effect, been taught robot parkour.
In the video posted late last month, RHex charges across Penn's campus (with an appropriately epic soundtrack) before showing off an impressive vertical leap, doing several back flips and propelling itself up steps.
Its most impressive moments, though, might be jumping from one picnic table to another over a gap greater than its own length and flipping up on a tall stone block, grabbing on with its curved front legs and pulling itself upward.
For robots, legs are more effective than wheels on rough terrain. But it can be complicated to teach the human-like legs of walking robots how to respond to unpredictable conditions. RHex's simple, one-jointed legs are better suited to getting around obstacles in creative ways, the Penn team says.
By Heather Kelly, CNN
Would you let a robot take over as a live-in nurse for your aging parent or grandparent?
By 2050, the elderly will account for 16 percent of the global population. That's 1.5 billion people over the age of 65, according to the Population Reference Bureau. Caring for those seniors - physically, emotionally and mentally - will be an enormous undertaking, and experts say there will be a shortage of professionals trained and willing to take on the job.
"We have to find more resources and have to get new ways of delivering those resources and delivering the quality of care," says Antonio Espingardeiro, an expert in robotics and automation at the University of Salford in Manchester, England, and a member of the IEEE Robotics and Automation Society.
Enter the elder-care robot.
Robots have the potential to meet many of the needs of an aging population, according to Espingardeiro. A software engineer, Espingardeiro is finishing his PhD on new types of human and robotic interaction. He has developed a model of elder-care robot, P37 S65, which can monitor senior patients and communicate with doctors while providing basic care and companionship.
By Heather Kelly, CNN
At six-foot-two and 330 pounds, this hulking first responder has all the qualities you'd want in the field after a disaster: strength, endurance and calm under pressure. Better yet, it has two sets of hands, 28 hydraulic joints, stereo cameras in its head and an onboard computer.
The ATLAS humanoid robot, which looks vaguely like something from the "Terminator" movies, was created by Boston Dynamics for DARPA, a research arm of the U.S. Department of Defense. It will compete in the DARPA Robotics Challenge (DRC), a competition that invites engineers to create a remotely controlled robot that can respond to natural or man-made disasters.
The winning robot could be used in situations deemed too dangerous for humans, like the 2011 nuclear disaster at Fukushima Daiichi Nuclear Power Plant.
The DRC is broken up into three challenges. The first was the Virtual Robotics Challenge, in which 26 teams controlled simulated, 3-D robots. Only seven of those teams - including participants from MIT, Carnegie Mellon, and NASA's Jet Propulsion Laboratory - were chosen to go on to the next stage. They will each get their very own ATLAS for the Robotics Challenge Trials, a real-life obstacle course competition between robots that will take place this December in Florida.
As part of the challenge, the teams will program their humanoid robot to accomplish a range of tasks. ATLAS will need to drive a car, navigate complicated terrain on foot and move rubble in order to enter a building. It will also have to climb stairs and use various tools to do things like turn off valves or break through concrete walls.
ATLAS has modular wrists so that it can swap out hands and attach third-party mitts to better handle specific tasks. The robot's head also has LIDAR to better gather information about the surrounding area.
The robots will need to be able to complete tasks on their own without constant human control, a key feature for situations where communications are spotty. DARPA also wants the final robots to be easily controlled by people with minimal training, so that the technology is accessible to more people on short notice.
The teams whose robots perform the best at the trials later this year will continue to receive funding and compete in the competition's final stage in December 2014. The Robotics Challenge Finals will put the robots through a full disaster scenario that will include eight tasks each robot must complete.
Beyond advancing future disaster response, the winner of the 27-month competition will receive a $2 million prize.
The ATLAS robots are the result of a $10 million contract with Boston Dynamics, the Massachusetts engineering and robotics-design company. That amount covers eight robots, in-field support and any necessary maintenance.
By Heather Kelly, CNN
A graduate student wearing a skull cap covered in wires sits perfectly still and thinks about making a fist with his right hand.
Nearby, a small quadcopter - a flying drone with four rotors - turns right. He imagines making a fist with his left hand and the robotic flying copter goes left. After a thought about clenching both hands, it lifts higher into the air.
He is controlling the device with his mind.
The system is part of a new research project that reads the brain's electrical activity and translates certain thoughts into commands for the unmanned aerial vehicle. It's called a brain-computer interface, and someday it could have important uses for people who are paralyzed.
"We envision that they’ll use this technology to control wheelchairs, artificial limbs or other devices," said University of Minnesota engineering professor Bin He in a post announcing the project.
Here's how it works: Imagining specific movements without actually doing them produces electric currents in the motor cortex. The interface itself isn't new, but the researchers used brain imaging scans to find out exactly which imagined movements activated which neurons.
Once they mapped out the various thoughts and associated signals, they used them to control a helicopter simulation on a computer. Next, they moved on to real flying devices.
No implants or invasive brain tweaks are needed for subjects to control the copter with their brains. The technology is called electroencephalography (EEG). The skull cap uses 64 electrodes to detect these currents from a subject's brain as they think about the associated actions, then translates that data into instructions and transmits them to the quadcopter over Wi-Fi.
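In outline, the pipeline is: electrodes pick up voltage changes, a classifier labels the motor-imagery pattern, and the label maps to a flight command. A minimal sketch of that last mapping stage follows; the label and command names are assumptions for illustration, not the Minnesota team's actual code:

```python
# Hypothetical mapping from a decoded motor-imagery label to a
# quadcopter command, mirroring the article's description:
# right fist -> turn right, left fist -> turn left, both -> ascend.

COMMANDS = {
    "right_fist": "turn_right",
    "left_fist": "turn_left",
    "both_fists": "ascend",
}

def to_command(imagery_label: str) -> str:
    """Translate a decoded imagined movement into a flight command."""
    # Unrecognized or low-confidence labels default to hovering in place,
    # a safe fallback when the EEG signal is ambiguous.
    return COMMANDS.get(imagery_label, "hover")
```

The hard part in practice is upstream of this table: turning 64 channels of noisy EEG into a reliable label in the first place, which is what the training hours are for.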
In the test, pilots weren't allowed to look at the quadcopter while they controlled it, only a screen showing the view from a small camera mounted on the front of the flying vehicle. After a few hours of training, the subjects could move the quadcopters with precision, even guiding them through hoops suspended from the ceiling.
Flying is just the start for this technology, He said.
"It may even help patients with conditions like autism or Alzheimer’s disease or help stroke victims recover," he said. "We’re now studying some stroke patients to see if it’ll help rewire brain circuits to bypass damaged areas."
Brazilian-born Miguel Nicolelis is a professor of neurobiology at Duke University and a pioneer in the field of brain-machine interfaces, in which brain waves from a human or animal control a robotic limb prosthesis. For more on Nicolelis and his work, watch "The Next List" this Sunday at 2:30 pm ET on CNN.
By Miguel Nicolelis
For the past 30 years, I have dedicated my career as a neurobiologist to unveiling the physiological principles by which our brain circuits, formed by billions of interconnected cells known as neurons, create the entirety of our human nature and history out of sheer electrical brainstorms.
To pursue this quest, my colleagues and I at the Duke University Center for Neuroengineering have developed a variety of new methods and technologies to probe the brain in search of any hint, any glimpse that could place us on the right trail to answer the greatest mysteries of all times: how the entire wealth of the human mind emerges from a mesh of organic tissue.
In 1999, John Chapin, my former postdoctoral advisor, and I published a scientific paper that introduced to the neuroscience community what by then seemed to be just another promising new experimental tool in brain research. Without much ceremony, we named this new experimental paradigm brain-machine interfaces (BMIs) and, in a flurry of papers that followed the original report, we described the technical details of our unorthodox combination of neurophysiological methods, real-time computing and robotics to create a direct and bidirectional interface between living animal brains and a variety of mechanical and electronic machines.
In the late 1990s, our initial effort in building such devices was entirely motivated by the desire to establish a powerful experimental tool to carry on work related to the investigation of the neurophysiological principles that allow behavior, the true business of the brain, to emerge flawlessly and effortlessly, time and time again, from the widespread dynamic interactions of large populations of neurons that comprise any brain circuit.
By the time our original papers were published in scientific journals, very few people, outside a small number of experts working in the emergent field of BMIs, could envision the enormous clinical potential that this newly acquired ability to interface brains and machines could unleash and how it could influence the future of rehabilitation medicine.
What a difference 15 years make! After a mere decade and a half of intense research and stunning experimental demonstrations, brain-machine interfaces have become the core of a large variety of potential future new therapies for neurological disorders, such as untreatable epilepsy, Parkinson’s disease and devastating levels of body paralysis. Moreover, in the not so remote future, BMIs of a different variety may allow us to perform a lot of our daily routine tasks, such as interacting with our smartphones, just by thinking!
Welcome to the era of brain-actuating technology; the age in which the brain’s voluntary desire to move will be liberated from the physical limits of the human body that host it.
In the CNN show you are about to watch, you will be introduced to the Walk Again Project (WAP), the first non-profit international brain-research consortium aimed at building a new generation of robotic limb prostheses that can be directly controlled by the subject's own brain activity through a brain-machine interface. In the future, we hope that neuroprostheses such as the ones the WAP intends to build could be used to restore full-body mobility in tens of millions of severely paralyzed patients worldwide.
To showcase to the entire world that this moment could be fast approaching, the WAP has proposed to have the first public demonstration of such a potentially revolutionary medical rehabilitation technology during the opening football match of the FIFA 2014 Soccer World Cup on June 12, 2014, in São Paulo, Brazil.
According to this proposal, at 5:00 pm that afternoon, a young Brazilian adult who is paralyzed from the waist down will emerge onto the pitch wearing a robotic vest, known as an exoskeleton, whose movements are controlled by brain-derived signals. Then, using all his voluntary will, this true herald of a new era shall walk autonomously all the way to center field and, once there, kick a ball to deliver the official start of the World Cup.
In essence, what we propose is that, in the land that invented the “beautiful game," the opening kickoff of the greatest sports event in the world becomes a scientific “Gol” to all of humanity.
Editor’s Note: Ed Lu is an explorer who loves mapping the unknown – whether it’s the oceans at Liquid Robotics, our neighborhoods, leading Google Advanced Projects Teams, or unveiling the secrets of the inner solar system and saving the world with the B612 Foundation, where he serves as CEO. A NASA Astronaut, he’s flown three missions, logging 206 days in space to construct and live aboard the International Space Station. Watch Ed Lu’s full plan to save the world, this Sunday 2:30 P.M. E.T. on “The Next List”
By Ed Lu, Special to CNN
Today's meteor explosion over Chelyabinsk is a reminder that the Earth orbits the Sun in a shooting gallery of asteroids, and that these asteroids sometimes hit the Earth. Later today, a separate and larger asteroid, 2012 DA14, will narrowly miss the Earth, passing beneath the orbits of our communications satellites. We have the technology to deflect asteroids, but we cannot do anything about the objects we don't know exist.
Discovered just one year ago by an amateur citizen observer, 2012 DA14 will fly only 17,000 miles above Earth - the distance the Earth travels in just 15 minutes, and not much farther than many people fly on commercial flights. So this truly is a close shave.
This particular object is not large for an asteroid; it is about 160 feet across, or roughly the size of an office building. It is not going to hit us on February 15, but it should serve as a wake-up call for planetary defense. Consider that just 105 years ago, an asteroid slightly smaller than this struck Earth in Siberia near Tunguska and completely flattened a forested area of 1,000 square miles, an expanse larger than New York City or Washington, D.C.
2012 DA14 is what is known as a near-Earth asteroid because its orbit crosses Earth's orbit, meaning it may someday run into Earth. Millions of these asteroids exist; we just can't see them from Earth. Of the million asteroids as large as or larger than 2012 DA14, we have tracked fewer than 10,000. That we knew ahead of time that 2012 DA14 would buzz by Earth is really only a matter of luck. Ninety-nine percent of the time we are oblivious to such impending flybys, simply because we currently don't have the means to map and track the other 99 percent of these objects.
We established the non-profit B612 Foundation to protect humanity from asteroid impacts and, at the same time, open space to future exploration. Our Sentinel Mission is an infrared space telescope that we will launch and place in orbit around the Sun. From its vantage point looking back at Earth’s orbit, Sentinel will discover, map and track the trajectories of asteroids whose orbits approach Earth and threaten humanity. We will be the first privately funded, launched and operated interplanetary mission, and the most ambitious private space mission in history.
The Sentinel Map will give us decades of advance notice of an impending impact so that deflection becomes relatively easy. There are several promising technologies, including kinetic impactors, gravity tractors and nuclear standoff explosions. The urgency in completing the map arises because there could be an impact in the next few decades. With only a few years' notice, the task of deflecting an asteroid becomes extremely difficult, to the point where it could become almost impossible (depending on the size of the asteroid) using current technology. Every year of delay in completing Sentinel increases our chances of having no available options. Why take this risk?
The chance in 90 years (roughly your lifetime) of Earth being hit by another asteroid like the one at Tunguska is about one in three. Shouldn't we know in advance of the next asteroid impact, and actually prevent it?
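The one-in-three figure over 90 years can be sanity-checked with a simple independence assumption: treat each year as having the same small impact probability and solve for it. This is a back-of-envelope model for illustration, not a claim about the real impact statistics:

```python
LIFETIME_YEARS = 90
P_LIFETIME = 1 / 3   # stated odds of a Tunguska-class hit in ~90 years

# If each year independently has probability p of such an impact, then
# 1 - (1 - p)**90 = 1/3. Solve for the implied annual probability p.
p_annual = 1 - (1 - P_LIFETIME) ** (1 / LIFETIME_YEARS)

# Expected years between such impacts under this model.
mean_interval = 1 / p_annual
```

This works out to an annual probability of roughly 0.45 percent, or about one Tunguska-scale impact every two centuries on average, consistent with the last such event happening 105 years ago.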
By David Lang, Special to CNN
With the presidential election coming up, it’s hard to go anywhere without hearing an opinion on the candidates' plan for the economy, specifically manufacturing jobs.
Despite all the bad economic news the past few years, I couldn’t be more excited about the potential for manufacturing in this country.
My optimism isn’t based on macroeconomic reports or expert opinions. In fact, it’s entirely personal. Our OpenROV, an open-source underwater robot, recently raised over $110,000 on Kickstarter, the crowdfunding platform for creative projects. This experience has jump-started our micro-manufacturing company as we move into a small facility in Berkeley, California.
For me, this project has become a surprising new career. After unexpectedly losing my job last year, I was forced to rethink my entire life direction. I came to the stark realization that all I was qualified to do was sit in front of a computer screen. Instead of scrambling back into the rat race and trying to find another job behind a desk, I decided to focus on a more fundamental skill set: actually making things.
In the months after being laid off, I immersed myself in the growing Maker Movement by spending two months taking every class I could at TechShop, a members-only workshop in San Francisco. I took woodworking, laser cutting, welding, computer-aided design and manufacturing, and everything in between.
Rather unexpectedly, the side project I had begun with my friend Eric Stackpole began to gain momentum, and now the project is a full-time job.
The better news is that my "Zero to Maker" story is becoming more common than people realize. New tools and machines - 3-D printers, laser cutters, open-source micro-controllers - along with online communities and maker spaces are democratizing the means of production.
Now, anyone with an idea can quickly prototype it. And websites like Kickstarter make it easy for those ideas to develop a community of supporters (these ideas are beautifully articulated in Chris Anderson's new book, "Makers").
Our OpenROV project, for example, has professional and amateur ocean engineers contributing from all over the world.
It doesn’t take an engineering background or industrial design degree to join the new maker economy. Everything I’ve learned, from how to use the machines to setting up a micro-manufacturing operation, has been on-the-spot and just-in-time. There’s a community of makers ready to get you up to speed. It’s an affordable and accessible way to re-skill yourself. We’re all learning together.
When I started, I worried I was at a permanent disadvantage because I had never used a soldering iron. Now, I'm worrying about how we're going to fill all the orders for our robots. It's a much better problem to have.
By John D. Sutter, CNN
(CNN) - Watching the Olympics, which kick off in earnest Friday with the opening ceremony in London, is more fun when you know the stories behind the Games.
No doubt, sports broadcasters will hammer on plenty of rags-to-riches, against-the-odds backstories about the Olympic athletes. (You can also find plenty of them on CNN's London 2012 page). And that's all good. But knowing the technological underpinnings of the Games is perhaps just as intriguing.
Here's a quick look at 10 of the most interesting tech stories to watch at the London Olympics: