For 12 years, Harvard engineering professor Robert Wood has been trying to get a fly-sized drone off the ground. He and his colleagues have had to overcome issues of weight, the aerodynamics of wing flapping, power supply, and how to manufacture a robot smaller than a quarter. Finally, the little robo-fly is airborne.
“This is the culmination of over a decade of work I’ve been trying to do to get this result,” said Wood, who runs a lab at Harvard University’s Wyss Institute for Biologically Inspired Engineering. “This is the first demonstration that you can make insect-like robots and control them in flight.”
Wood and graduate student Kevin Ma are publishing their findings in today’s issue of Science. The paper describes how they built the device, which is part of a larger “robo-bee” project to build swarms of tiny insect-like robots that could be used to hunt for missing people, spy on enemies, track toxic pollutants or even pollinate crops if real bees are wiped out by disease.
To build the tiny drone, Wood and the members of his lab had to figure out a way of cutting and folding tiny sheets of composite material, then opening them like a pop-up book to get the desired shape.
“The innovation we’ve been working on is the manufacturing,” said Ma, who is the first author on the Science paper. The robot went through several generations of flight and control testing before the team arrived at one that worked.
The final model weighs only 80 milligrams (about 0.003 ounces) and flaps its wings at 120 beats per second.
The tiny robot flaps its wings with piezoelectric actuators—strips of ceramic that expand and contract when an electric field is applied. Thin hinges of plastic embedded within the carbon fiber body frame serve as joints, and a delicately balanced control system commands the rotational motions in the flapping-wing robot, with each wing controlled independently in real-time.
At a small scale, tiny changes in the flow of air around the wings can have huge effects on how the device flies, so the control system has to react quickly for the robot to remain stable.
“We’ve had robots in the past that have flown but they haven’t stabilized themselves in flight and couldn’t generate enough body torques,” Ma said. “The new design controls each wing separately. That was another huge innovation.”
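The independent per-wing control Ma describes can be sketched in code. The following is purely illustrative, assuming a simple proportional-derivative scheme in which each wing's stroke amplitude is trimmed separately to generate corrective body torques; the gains, structure, and names are hypothetical, not taken from the paper.

```typescript
// Illustrative PD stabilizer: each wing's stroke amplitude is adjusted
// independently so an attitude error produces a corrective torque.
// Gains and geometry are hypothetical, not from the Science paper.
interface BodyState {
  roll: number;     // radians, positive = right wing dipped
  rollRate: number; // radians per second
}

function wingCommands(
  state: BodyState,
  baseAmplitude: number, // nominal stroke amplitude (radians)
  kp: number,            // proportional gain
  kd: number             // derivative gain
): { left: number; right: number } {
  // A rightward roll asks the dipped (right) wing for more lift, so its
  // stroke amplitude grows while the left wing's shrinks by the same amount.
  const correction = kp * state.roll + kd * state.rollRate;
  return {
    left: baseAmplitude - correction,
    right: baseAmplitude + correction,
  };
}
```

With zero attitude error both wings receive the nominal amplitude; any roll disturbance immediately skews the commands, which is the kind of fast reaction the article says small-scale stability demands.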
Wood says the robo-bee project is producing a lot of spin-off technologies that can be used for other sorts of micro-manufacturing, as well as for answering biological questions about flies and bees themselves.
“We foresee a lot of technology fallout from those solutions,” he said. “If you have those devices, they can also be useful for answering open scientific questions: why is an insect wing shaped the way it is? If you can make wings with insect-like properties, then perhaps you could use these tools to ask functional biology or morphology questions.”
The process of building a miniature flying robot has also attracted young people into the world of science and engineering, according to Wood.
“These devices have a high coolness factor,” he said. “It’s easy to get kids excited about this. This is what you could be doing if you chose a career in science and engineering.”
John Gallagher, associate professor of computer science at Wright State University, said the paper is a good roadmap for others interested in building tiny machines.
“If we really care about small micro-manufacturing, this is a great test problem for that,” Gallagher said. “They are doing the equivalent of surgery with axes and getting away with it. It’s amazing.”
Wood said the next step is to get a tiny battery inside the robo-fly so that it doesn’t need a tethered power source. Another group at the University of Washington is working on a potential solution that would transfer power wirelessly, he said.
A fly-controlled heat-box has been designed to study operant conditioning in several studies of Drosophila. Each time a fly walks into the designated half of the tiny dark chamber, the whole space is heated. As soon as the animal leaves the punished half, the chamber temperature reverts to normal. After a few minutes, the animals restrict their movements to one-half of the chamber, even if the heat is switched off.
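The heat-box contingency described above is simple enough to express directly. This is a minimal sketch of the apparatus logic only, with illustrative names; it is not taken from any published experiment code.

```typescript
// Minimal sketch of the heat-box contingency: the chamber is heated
// exactly while the fly occupies the punished half, and reverts to
// baseline as soon as it leaves. Names and temperatures are illustrative.
type Half = "punished" | "safe";

function chamberTemperature(
  flyPosition: Half,
  baseline: number, // degrees C with heat off
  heated: number    // degrees C while the punished half is occupied
): number {
  return flyPosition === "punished" ? heated : baseline;
}
```

Because the punishment is contingent on the fly's own movement, the fly's restriction to one half even after the heat is switched off is what marks the learning as operant.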
A Drosophila flight simulator has been used to examine operant conditioning. The flies are tethered in an apparatus that measures the yaw torque of their flight attempts and stabilizes movements of the panorama. The apparatus controls the fly's orientation based on these attempts. When the apparatus was set up to direct a heat beam on the fly if it "flew" to certain areas of its panorama, the flies learned to prefer and avoid certain flight orientations in relation to the surrounding panorama. The flies "avoided" areas that caused them to receive heat.
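One step of that closed loop can be sketched as follows, assuming a simplified model in which the fly's yaw-torque attempt rotates the panorama through a fixed coupling gain and heat is applied whenever the orientation lands in a punished sector. The gain, sector bounds, and names are hypothetical.

```typescript
// Illustrative single step of the tethered flight simulator loop: the
// measured yaw-torque attempt rotates the panorama, and heat is applied
// whenever the resulting orientation falls inside a punished sector.
// Coupling gain and sector geometry are assumptions, not apparatus specs.
function simulatorStep(
  orientationDeg: number, // current panorama orientation, 0..360
  yawTorque: number,      // measured torque attempt (arbitrary units)
  gain: number,           // torque-to-rotation coupling
  punishedFrom: number,   // punished sector start (degrees)
  punishedTo: number      // punished sector end (degrees)
): { orientationDeg: number; heatOn: boolean } {
  // Wrap the new orientation into [0, 360).
  const next = ((orientationDeg + gain * yawTorque) % 360 + 360) % 360;
  const heatOn = next >= punishedFrom && next < punishedTo;
  return { orientationDeg: next, heatOn };
}
```

The fly controls only its torque attempts, yet those attempts determine whether it gets heated, which is what lets it learn to steer away from the punished orientations.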
These experiments show that Drosophila can use operant behaviour and learn to avoid noxious stimuli. Notably, these responses were plastic, complex behaviours rather than simple reflex actions, more consistent with the experience of pain than with simple nociception.
Now, the captchas provided by the site aren’t very “hard” to solve; in fact, they’re downright bad.
But there are many interesting parts here:
The HTML 5 Canvas getImageData API is used to get at the pixel data from the Captcha image. Canvas gives you the ability to embed an image into a canvas (from which you can later extract the pixel data back out again).
The pixel data, extracted from the image using Canvas, is fed into the neural network in an attempt to divine the exact characters being used – in a sort of crude form of Optical Character Recognition (OCR).
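The two steps above can be sketched roughly as follows. The Canvas drawing and `getImageData` call are the standard browser API; the grayscale-flattening helper and all function names are illustrative, not taken from the original script.

```typescript
// Sketch of the pipeline described above: draw the captcha image into a
// canvas, pull the raw RGBA bytes back out with getImageData, and
// flatten them into a normalized grayscale vector of the kind a small
// neural network expects. Names here are illustrative.
function captchaPixels(img: HTMLImageElement): Uint8ClampedArray {
  const canvas = document.createElement("canvas");
  canvas.width = img.width;
  canvas.height = img.height;
  const ctx = canvas.getContext("2d")!;
  ctx.drawImage(img, 0, 0);
  // getImageData exposes the pixels as [R, G, B, A, R, G, B, A, ...].
  return ctx.getImageData(0, 0, img.width, img.height).data;
}

// Pure step: collapse each RGBA quad to one grayscale value in [0, 1],
// producing one input per pixel for the network.
function toGrayscale(rgba: Uint8ClampedArray): number[] {
  const out: number[] = [];
  for (let i = 0; i < rgba.length; i += 4) {
    const luma = 0.299 * rgba[i] + 0.587 * rgba[i + 1] + 0.114 * rgba[i + 2];
    out.push(luma / 255);
  }
  return out;
}
```

The interesting part is that the whole thing runs client-side: Canvas stands in for the image-decoding half of an OCR pipeline, and the network only ever sees plain numbers.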
A few co-workers and I at NASA's Marshall Space Flight Center (MSFC) have been working on a small formation flying project over the last 9 months. Our project is to demonstrate formation flying using a decentralized mesh network first by flight testing the concept with small UASes (nearly standard 3DR quads), and then simulating the same system being used on satellites. We had a very successful demo flight last week with 5 quads flying, and I thought now would be a great time to show our results and share our work with the sUAS community.
We've been using 3DR UAS systems for several years now on several projects including an internal NASA search and rescue unmanned system competition and to provide aerial footage of test flights of the Mighty Eagle lunar lander testbed at MSFC.
For this new project, we took the standard quad frame and electronics with Pixhawks as the flight computer and added to that a Beaglebone Black to run our formation logic and a pair of XBee radios to provide the formation communication between our vehicles (which we call nodes). Each node communicates using its two XBees which are each running a separate mesh network for redundancy. The nodes exchange state information that they receive from the Pixhawks (GPS position, etc.) and then use that information to determine where they are in the formation and where they need to go. Since they all share their information, there is no one master or leader of the formation.
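Because every node ends up with the same shared state table, each one can compute its own place in the formation independently. The sketch below assumes a simplified 2D line-abreast formation anchored on the centroid of the current positions; the geometry, slot rule, and names are assumptions, not the actual NASA formation logic.

```typescript
// Illustrative leaderless slot assignment: every node holds the same
// state table received over the mesh, so each can deterministically sort
// the node IDs, pick its own slot, and compute a goal point offset from
// the formation centroid -- no master required. All names and the
// line-abreast geometry are assumptions, not the project's actual code.
interface NodeState {
  id: number;
  east: number;  // meters, local tangent frame
  north: number;
}

function goalPosition(
  myId: number,
  states: NodeState[],
  spacing: number // meters between adjacent slots
): { east: number; north: number } {
  // Deterministic ordering: every node sorts the same table the same way.
  const ids = states.map(s => s.id).sort((a, b) => a - b);
  const slot = ids.indexOf(myId);
  // The centroid of the current positions anchors the formation.
  const cx = states.reduce((sum, n) => sum + n.east, 0) / states.length;
  const cy = states.reduce((sum, n) => sum + n.north, 0) / states.length;
  // Line-abreast slots centered on the centroid, spaced east-west.
  const offset = (slot - (ids.length - 1) / 2) * spacing;
  return { east: cx + offset, north: cy };
}
```

Since the computation depends only on the shared table, losing any one node (or one of the two redundant XBee meshes) never orphans the formation the way losing a designated leader would.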