
Saturday, March 28, 2015

Soft Robotics


https://www.youtube.com/watch?v=jGuyDs1Zlaw


http://biodesign.seas.harvard.edu/soft-robotics

Multi-material fluidic actuators
Soft fluidic actuators consisting of elastomeric matrices with embedded flexible materials (e.g. cloth, paper, fiber, particles) are of particular interest to the robotics community because they are lightweight, affordable and easily customized to a given application. These actuators can be rapidly fabricated in a multi-step molding process and can achieve combinations of contraction, extension, bending and twisting with simple control inputs such as pressurized fluid. Our approach is to use new design concepts, fabrication approaches and soft materials to improve the performance of these actuators compared to existing designs. In particular, we use motivating applications (e.g. heart assist devices, soft robotic gloves) to define motion and force profile requirements. We can then embed mechanical intelligence into these soft actuators to achieve these performance requirements with simple control inputs.


Modeling of soft actuators
Characterizing and predicting the behavior of soft multi-material actuators is challenging due to the nonlinear nature of both the hyper-elastic material and the large bending motions they produce. We are working to comprehensively describe the principle of operation of these actuators through analytical, numerical and experimental approaches and characterize their outputs (motion and force) as a function of input pressure as well as geometrical and material parameters. Both models and experiments offer insight into the actuator behavior and the design parameters that affect it. We envision this work will lead to improved predictive models that will enable us to rapidly converge on new and innovative applications of these soft actuators.
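
As a flavor of what such a predictive model can look like, here is a minimal sketch that fits a reduced-order pressure-to-bend-angle relation from bench-top measurements. The power-law form, the parameter names and the data points are illustrative assumptions for this post, not the lab's actual models or data.

import numpy as np
from scipy.optimize import curve_fit

# Illustrative bench data: input pressure (kPa) vs. measured tip bend angle (deg).
pressure = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
angle = np.array([8.0, 22.0, 45.0, 76.0, 110.0])

def bend_model(p, a, b):
    # Simple power law standing in for the nonlinear response of the
    # hyper-elastic matrix; a and b are empirical fit parameters.
    return a * p ** b

(a, b), _ = curve_fit(bend_model, pressure, angle)
print(f"angle ~ {a:.4f} * pressure^{b:.2f}")
print(f"predicted bend at 90 kPa: {bend_model(90.0, a, b):.1f} deg")

A fit like this is no substitute for the analytical and numerical models described above, but it shows how an application's motion requirements can be checked quickly against a candidate actuator design.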


Sensing and control
In order to control soft actuators, we need means of monitoring their kinematics, their interaction forces with objects in the environment and their internal pressure. We accomplish this through the use of fully soft sensors, developed with collaborators, and miniature or flexible sensors that can be incorporated into the actuator design during the manufacturing process. For power and control, we use off-the-shelf components such as electronic valves, pumps, regulators, sensors and control boards to rapidly modulate the pressure inside the chambers of the actuators using feedback control of pressure, motion and force. In addition, we can use the analytical models we develop to estimate state variables that may be difficult to measure directly.
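
To make the control side concrete, below is a minimal sketch of one such feedback loop: a PI controller modulating an inlet-valve duty cycle to hold a target chamber pressure. The gains, the loop rate and the first-order "chamber" model (included so the sketch runs without hardware) are all assumptions for illustration; on a real rig, read_pressure() and set_valve() would wrap the drivers for the particular sensor and electronic valve in use.

TARGET_KPA = 60.0
KP, KI = 0.05, 0.5        # illustrative gains, not tuned for any real hardware
DT = 0.01                 # 100 Hz control loop

pressure = 0.0            # simulated chamber pressure (kPa)

def read_pressure():
    return pressure

def set_valve(duty):
    global pressure
    # Stand-in plant: inflow proportional to valve opening, slow leak to ambient.
    pressure += (200.0 * duty - 0.5 * pressure) * DT

integral = 0.0
for _ in range(2000):     # 20 s of simulated time
    error = TARGET_KPA - read_pressure()
    u = KP * error + KI * integral
    duty = min(max(u, 0.0), 1.0)
    if u == duty:         # simple anti-windup: freeze the integral while the valve saturates
        integral += error * DT
    set_valve(duty)

print(f"pressure after 20 s: {read_pressure():.1f} kPa (target {TARGET_KPA} kPa)")

The same loop structure extends naturally to motion or force feedback, with the pressure setpoint replaced by the output of an outer control loop.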

Translational applications
There are approximately four million chronic stroke survivors with hemiparesis in the US today and another six million in developed countries globally. In addition, there are millions of other individuals suffering from similar conditions. For the majority of these cases, loss of hand motor ability is observed, and whether partial or total, this can greatly inhibit activities of daily living (ADL) and can considerably reduce one’s quality of life. To address these challenges, we are developing a modular, safe, portable, consumable, at-home hand rehabilitation and assistive device that aims to improve patient outcomes by significantly increasing the quantity (i.e. time) and quality of therapy at a reduced cost while also improving independence of users with chronic hand disabilities by enabling them to perform activities of daily living.



In the United States, the lifetime risk of developing heart failure is roughly 20%. The current clinical standard treatment is implantation of a ventricular assist device that contacts the patient’s blood and is associated with thromboembolic events, hemolysis, immune reactions and infections. We are applying the field of soft robotics to develop a benchtop cardiac simulator and a Direct Cardiac Compression (DCC) device employing soft actuators in an elastomeric matrix. DCC is a non-blood contacting method of cardiac assistance for treating heart failure involving implantation of a device that surrounds the heart and contracts in phase with the native heartbeat to provide direct mechanical assistance during the ejection phase (systole) and the relaxation phase (diastole) of the cardiac cycle.

Teaching Robots to Salsa



https://www.youtube.com/watch?v=7_kTRLt7uhc

As dancers, this couple is no Fred Astaire and Ginger Rogers. The leader’s moves are clunky, his partner’s so tentative that she’s constantly behind a beat. But be kind: they’re beginners at salsa, and they’re bedeviled by something Fred and Ginger never faced.

They’re robots.

H. Kayhan Ozcimder (ENG’11,’15), a dancer with the Boston troupe Collage, has had the inelegant experience of dancing with one of these machines, which resemble a vacuum cleaner minus the hose. Ozcimder dreams of a more agile automaton someday, but for now he’s pleased to have helped program these salsa-bots, proving that “it’s possible to do an art form in a robotic platform.”

Ozcimder is a graduate student in John Baillieul’s Intelligent Mechatronics Lab, whose mission, says the College of Engineering mechanical engineering professor, is to give machines the ability to respond to their environment. The researchers began by mapping the coordinates of actual salsa dancers and programming the robots with four basic beginner moves (relying on his dancer’s knowledge, Ozcimder suggested salsa as a simple starting point for the mechanized dance amateurs). The robots, which are outfitted with motion sensors, read each other’s moves and respond according to the programming.

Ozcimder thinks motion-reading robots might someday serve as useful tools for judging dance competitions (possibly bouncing Kirstie Alley even sooner from Dancing with the Stars), but Baillieul is hunting bigger game. He’s not out to help “some high school guy who had trouble getting a date, so you get a robot. The ultimate goal is to understand human reaction to gestures and how machines may react to gestures.” That could enable robots to team with, and perhaps take over from, humans in hazardous jobs, from treacherous rescues to repairs in lethal environments (think the workers who plunged into the stricken Fukushima Daiichi nuclear plant after the 2011 Japanese tsunami).

The next great dance team: robots that respond to each other's motions, designed by BU researchers.

The Intelligent Mechatronics Lab is littered with everything from dancing robots to flight vehicles. The work builds on an established fact of 21st-century life: computing machines will do more of the work. “Everyday objects like automobiles have gone from almost entirely mechanically engineered things to being machines that are basically controlled at every level by computers,” notes Baillieul. “A typical automobile now has 100 or more microprocessors in it.”

The challenge is to build machines that can perform tasks with some autonomy and respond in fluid situations they might not have been precisely programmed for, an instance where man still has it all over machines. Whereas human reaction is the child of several parents—instinct, surely, but also the ability to learn from experience and sometimes override instinct—robots are not yet agile enough to ignore their “instinct” (programming). The solution, says Baillieul, is to give the machines sufficiently “massive experiential data sets” that they can react to numerous situations.

One avenue the lab is exploring is humans’ use of nonverbal cues to communicate. Good dancers move seamlessly together, responding to each other’s touch and motions; amateurs without experience reading each other’s cues often come off looking stilted. Nonverbal cues can also be used to send misinformation; bats, for example, camouflage their motions so that they can sneak up on insect prey, a fake-out familiar to anyone who’s tried to swat a pesky fly. Hence the lab’s work with getting robots to use sensors to read each other’s metal-body language, aimed at “how you might program flying vehicles or mobile robots to do the right thing, in terms of communicating or not communicating through their motions,” Baillieul says.

Dance companies like Ozcimder’s can rest easy; even he doesn’t foresee automating human dancers out of a job. Robots may be geniuses at detecting footwork, body angles, and other technical metrics that go into a performance, but they can’t judge the intangible artistic panache that might please an audience, like dancers’ facial expressions.

Ozcimder has bad news for our mechanized friends: intangibles make up half the judging criteria at a typical salsa competition.

http://www.bu.edu/today/2013/dances-with-robots/



https://www.linkedin.com/pub/hasan-kayhan-ozcimder/89/540/b0a
http://people.bu.edu/johnb/


Sunday, March 1, 2015

Spot


Spot is a four-legged robot designed for indoor and outdoor operation. It is electrically powered and hydraulically actuated. Spot has a sensor head that helps it navigate and negotiate rough terrain. Spot weighs about 160 lbs.


http://www.wired.com/2015/02/creepy-collective-behavior-boston-dynamics-new-robot-dog/

Sunday, February 8, 2015

MahaDeviBot

https://www.youtube.com/watch?v=SHDWLqfW-ec

CommU: communication unity ... Sota: social talker



https://www.youtube.com/watch?v=TaR78QIxKSE

https://www.youtube.com/watch?v=oi8Thk8Dupc


TOKYO (AP) — The scientist behind a new talking robot in Japan says people should stop expecting robots to understand them, and instead try to chime in with robotic conversations.
Hiroshi Ishiguro's 28-centimeter (11-inch) tall button-eyed Sota, which stands for "social talker," is programmed to mainly talk with a fellow robot, and won't be trying too hard to understand human speech — the major, and often frustrating, drawback of companion robots.
Sota, shown to reporters at a Tokyo museum Tuesday, goes on sale in July at under 100,000 yen ($850) each. To fully enjoy its features, one would have to buy at least two of them, although people can buy just one.
"Don't stop at just two. Please buy three or four," said Ishiguro, a professor at Osaka University, who has previously shown a variety of robots that look eerily human, including one that's his double.
Ishiguro also demonstrated a more elaborate robot, CommU, which stands for "communication unity." It will cost five times as much as Sota.
The news conference to introduce Sota and CommU was led by two other humanoids, which appeared to talk with the two newest additions to Japan's robot pantheon.
Robot maker Vstone, which simplified Ishiguro's research to come up with commercial products, expects to sell 3,000 Sota robots in the first year, mostly to businesses. They could be used for tasks such as drawing attention to products on display.
Japan is a leading maker of robots, and its repertoire has ranged from industrial robots to whimsical toys.
Internet and telecommunications company Softbank Corp. will start selling Pepper, a humanoid it claims is designed to read human emotions, in Japan next month for 198,000 yen ($1,700), possibly heralding the era of everyday robots here.
Ishiguro said the idea behind Sota and CommU was similar to watching chattering children. An adult joining such a conversation would have low expectations and be engaging in dialogue for the fun of it, he said.
CommU is designed to make eye contact with rolling eyeballs, a feature Ishiguro believes is important to make conversations feel real.
In a demonstration, one CommU said to another CommU, "Do you know Denmark?"
It replied: "I love Denmark," to which the first said, "I love Denmark, too."
Ishiguro insists the robots can do more than just agree with each other, and can be programmed to carry on various kinds of conversations, including confrontational chatter.
But the main point is that people should stop expecting robots to live up to human expectations or merely do useful chores.
"Voice recognition has always been very difficult for robots," Ishiguro said. "Human beings should instead adjust to what robots can do."



http://hosted2.ap.org/APDEFAULT/495d344a0d10421e9baa8ee77029cfbd/Article_2015-01-20-AS--Japan-Babbling%20Bots/id-c74082bbc3b04ac5a8df29d410393bfb

Robots: Videos from the past

Although the channel is not updated anymore, it presents a good list of videos related to robots:

https://www.youtube.com/user/plasticpals1/videos

Sunday, December 7, 2014

The Unearthly History of Science Fiction: Robots

Dominic Sandbrook continues his exploration of the most innovative and imaginative of all genres, turning to science fiction's obsession with robots.

The idea of playing God and creating artificial life has fascinated us since the earliest days of science fiction - but what if our creations turn against us?

Dominic, leading writers and film-makers follow our hopes and fears from the first halting steps of Frankenstein's monster, via the threats of Doctor Who's Cybermen and The Terminator, the provocative ideas of Blade Runner and Battlestar Galactica, to the worlds of cyberspace and the Matrix, where humanity and technology merge.

Among the interviewees are Rutger Hauer (Blade Runner), actor Peter Weller (RoboCop), producer Gale Anne Hurd (The Terminator), Anthony Daniels and Kenny Baker (Star Wars), actor Edward James Olmos (Battlestar Galactica) and novelist William Gibson.


http://www.bbc.co.uk/programmes/p026c7n7

Tuesday, September 23, 2014

control software called V-Sido OS

SoftBank subsidiary Asratec Corp. has produced the following ad (in Japanese) for its control software, V-Sido OS, which provides real-time balance correction for humanoid robots.



Sunday, April 27, 2014

Robopocalypse Show

On the Robopocalypse Show, cyber-anchors interview former Wired editor-in-chief Chris Anderson and his 3D Robotics business partner Jordi Muñoz, who are offering up the technology to help turn robotic helicopters into autonomous drones. Next up, author Daniel H. Wilson discusses his own prophecies in the bestselling Robopocalypse. Wafaa Bilal, the NYU professor who had a camera implanted in the back of his head, shares his experience of life as a cyborg -- and what it's like having strangers on the internet shoot you with paintballs.

They also interview BigDog creator Marc Raibert, iRobot in Massachusetts, Bot & Dolly and Keepon-maker BeatBots in San Francisco, and Willow Garage, home of the PR2. Additionally, projects from top US schools such as MIT, Carnegie Mellon, Georgia Tech and Northeastern University are presented.

Finally, they crack open a time capsule from 2012 to check out some of the top consumer electronics of the day, including the Microsoft Surface, iPad Mini, and Nexus 4 and 10.

Be sure to watch this very special episode of the Engadget Show. Your life -- and everything you hold dear -- just might depend on it.


Download the Show:

High Definition
Medium Definition
Low Definition


SOURCE: The Engadget Show 38: Robopocalypse with Chris Anderson ...

Sunday, March 9, 2014

Shimi: A Robot That Digs Music (as much as you)

Shimi is a robotic musical companion. He’s like your personal DJ, the guy that knows how to keep a party going and always knows what you want to hear next. Plug a smartphone into Shimi’s smartphone dock and he instantly comes to life, making every song in your library a live performance. Shimi’s premium speakers reproduce your tunes in booming, crystal clear hi-def, and Shimi follows you to make sure you’re always getting the best sound.

When he’s not listening to music, Shimi makes sure you’re fully immersed in your online social life, even when you’re not in front of your PC. He’ll alert you when you get a Facebook post, do a shimmy when you get a new Twitter follower, and broadcast your dance parties to your remote friends around the world.

Shimi not only loves to listen to music, he also loves to create music. He’s even got the patience to teach you guitar! Play dance games online with your friends, and even teach Shimi a few new moves.

Whatever you’re into, Shimi makes sure you get the most out of your music.






References:
http://www.tovbot.com/t/AboutShimi
http://www.gtcmt.gatech.edu/research-projects/travis

Wednesday, February 19, 2014

Miniature Humanoid

The 20-DOF miniature humanoid MH-2, designed as a wearable telecommunicator, is a personal telerobot system. An operator can communicate with remote people through the robot, which acts as the operator's avatar. To date, four prototypes of the wearable telecommunicator have been developed as research platforms: T1, T2, T3 and MH-1. MH-1 is a miniature humanoid robot with 11 DOF for mutual telexistence. Although its human-like appearance might be important for such communication systems, MH-1 cannot achieve sophisticated gestures because it lacks both wrist and body motions. To tackle this problem, a 3-DOF parallel wire mechanism with a novel wire arrangement is introduced for the wrist, and 3-DOF body motions are also adopted. Consequently, a 20-DOF miniature humanoid with dual 7-DOF arms has been designed and developed.





Fig. 1 A Concept of Telecommunicator

Fig. 2 MH-1 with 11-DOF

Fig. 3 MH-2 with 20-DOF






REFERENCE
http://telerobotics.yz.yamagata-u.ac.jp/

Thursday, February 13, 2014

Collective construction

Termite colonies build tremendous, complicated mounds, acting with no central control or careful advance planning. These social insects provide a fantastic proof of principle that (relatively) simple agents, acting independently with access only to local information, can build amazing things. How could we build and program robot swarms—artificial termite colonies—to build things for us? We want a human user to be able to give such a swarm a high-level description of what they want built, and have a guarantee that the system will build that thing, without the user having to get into the details of how it's done.
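
As a toy illustration of the principle (not the TERMES algorithms themselves), the sketch below has simulated builders that each see only one site at a time and obey a single local rule: place a block only where the target structure calls for one, and only on the ground or on top of an existing block. The grid, rule and target shape are assumptions for illustration.

import random

# Target structure as (column, height) cells; height 0 is the ground layer.
target = {(0, 0), (1, 0), (2, 0), (1, 1), (1, 2)}
built = set()

def can_place(cell, built):
    # Local rule: fill a cell only if the target wants it and the cell
    # beneath it (if any) has already been built.
    x, h = cell
    return cell in target and (h == 0 or (x, h - 1) in built)

random.seed(0)
while built != target:
    # Each simulated robot wanders to a random unbuilt target cell...
    cell = random.choice(sorted(target - built))
    # ...and places a block only if its local rule allows it.
    if can_place(cell, built):
        built.add(cell)

print("built structure:", sorted(built))

No builder here knows the overall plan or talks to the others, yet the target shape always emerges, which is the proof-of-principle the paragraph above describes.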

Read more about earlier work or the more recent TERMES project, which I co-lead with Radhika Nagpal.

Main Researcher: http://people.seas.harvard.edu/~jkwerfel/
Related article (in Spanish): http://www.elmundo.es/ciencia/2014/02/13/52fd18dae2704e702e8b457e.html


Tuesday, February 11, 2014

The new robot of Aldebaran Robotics

Nao, the companion robot created by Aldebaran Robotics, uses deep learning to improve its emotional intelligence, facial recognition and ability to communicate in multiple languages. The real innovation challenge, it seems, will not be to apply deep learning to replace humans but to use it to create new ideas, products and industries that will continue to generate new jobs and opportunities (1).

Apropos, if you happen to be interested in the background song, you can listen to it at http://youtu.be/4VeA_9bismc



References
(1) http://robohub.org/deep-learning-creating-jobs-in-apps-wearable-tech-and-robotics/

Monday, October 7, 2013

Self-assembling robots

Small cubes with no exterior moving parts can propel themselves forward, jump on top of each other, and snap together to form arbitrary shapes.



Reference
http://web.mit.edu/newsoffice/2013/simple-scheme-for-self-assembling-robots-1004.html

Friday, July 12, 2013

Robot Revolution: Will Machines Surpass Humans?



This NHK World programme, titled Robot Revolution: Will Machines Surpass Humans?, features Honda's ASIMO, Hubo, BigDog from Boston Dynamics, Baxter from Rethink Robotics, Nextage and other humanoids.

Disclaimer: I do not own the copyright to this programme; it is a recording from NHK World and remains their intellectual property. I do not intend to break copyright, but I feel the programme's usefulness in inspiring innovation and improving lives is more important. Hopefully NHK will make the content available online themselves.

Apologies for the missing 10 minutes at the end: I rushed to find an SD card for the last showing of the programme, and since the card already had data on it the final 10 minutes weren't recorded. I remember them as being a recap of the rest of the programme.


Tuesday, July 2, 2013

Researchers Use Video Game Tech to Steer Roaches on Autopilot





North Carolina State University researchers are using video game technology to remotely control cockroaches on autopilot, with a computer steering the cockroach through a controlled environment. The researchers are using the technology to track how roaches respond to the remote control, with the goal of developing ways that roaches on autopilot can be used to map dynamic environments – such as collapsed buildings.
(Photo credit: Alper Bozkurt)
The researchers have incorporated Microsoft’s motion-sensing Kinect system into an electronic interface developed at NC State that can remotely control cockroaches. The researchers plug in a digitally plotted path for the roach, and use Kinect to identify and track the insect’s progress. The program then uses the Kinect tracking data to automatically steer the roach along the desired path.
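
A minimal sketch of what that closed loop can look like is below: a tracker (the Kinect, in the paper) reports the roach's position and heading, the controller computes the bearing to the next waypoint on the plotted path, and it pulses the electrode on the side opposite the desired turn, since stimulating an antenna steers the insect away from that side (see the explanation further down). The function name, deadband and command strings are illustrative assumptions, not the NC State interface.

import math

DEADBAND_DEG = 10.0   # while roughly on course, let the roach walk freely

def steering_command(pos, heading_deg, waypoint):
    # Bearing from the tracked position to the next waypoint on the path.
    bearing = math.degrees(math.atan2(waypoint[1] - pos[1], waypoint[0] - pos[0]))
    # Heading error wrapped to [-180, 180); positive means the goal is to the left.
    error = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    if abs(error) < DEADBAND_DEG:
        return None
    # Stimulating an antenna turns the roach away from that side, so pulse
    # the right antenna to turn left and the left antenna to turn right.
    return "stimulate_right" if error > 0 else "stimulate_left"

# Roach at the origin facing +x, next waypoint up and to the right:
print(steering_command((0.0, 0.0), 0.0, (1.0, 1.0)))  # -> stimulate_right (turn left)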

“Kinect-based System for Automated Control of Terrestrial Insect Biobots”
Authors: Eric Whitmire, Tahmid Latif and Alper Bozkurt, North Carolina State University
Presented: July 4, 2013, 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society
Abstract: Centimeter scale mobile biobots offer unique advantages in uncertain environments. Our previous experimentation has demonstrated neural stimulation techniques in order to control the motion of Madagascar hissing cockroaches. These trials relied on stimulation by a human operator using a remote control. We have developed a Kinect-based system for computer operated automatic control of cockroaches. Using image processing techniques and a radio transmitter, this platform both detects the position of the roach biobot and sends stimulation commands to an implanted microcontroller-based receiver. The work presented here enables repeatable experimentation and allows precise quantification of the line following capabilities of the roach biobot. This system will help refine our model for the stimulation response of the insect and improve our ability to direct them in increasingly dynamic situations.


Remote control cockroach cyborgs








Researchers from North Carolina State University have developed a technique that uses an electronic interface to remotely control, or steer, cockroaches.
“Our aim was to determine whether we could create a wireless biological interface with cockroaches, which are robust and able to infiltrate small spaces,” says Alper Bozkurt, an assistant professor of electrical engineering at NC State and co-author of a paper on the work. “Ultimately, we think this will allow us to create a mobile web of smart sensors that uses cockroaches to collect and transmit information, such as finding survivors in a building that’s been destroyed by an earthquake.
“Building small-scale robots that can perform in such uncertain, dynamic conditions is enormously difficult,” Bozkurt says. “We decided to use biobotic cockroaches in place of robots, as designing robots at that scale is very challenging and cockroaches are experts at performing in such a hostile environment.”
Researchers were able to precisely steer the roaches along a curved line.
But you can’t just put sensors on a cockroach. Researchers needed to find a cost-effective and electrically safe way to control the roaches, to ensure the roaches operate within defined parameters – such as a disaster site – and to steer the roaches to specific areas of interest.
The new technique developed by Bozkurt’s team works by embedding a low-cost, lightweight, commercially available chip with a wireless receiver and transmitter onto each roach (they used Madagascar hissing cockroaches). Weighing 0.7 grams, the cockroach backpack also contains a microcontroller that monitors the interface between the implanted electrodes and the tissue to avoid potential neural damage.
The cerci are sensory organs on the roach’s abdomen, which are normally used to detect movement in the air that could indicate a predator is approaching – causing the roach to scurry away. But the researchers use the wires attached to the cerci to spur the roach into motion. The roach thinks something is sneaking up behind it and moves forward.
The wires attached to the antennae serve as electronic reins, injecting small charges into the roach’s neural tissue. The charges trick the roach into thinking that the antennae are in contact with a physical barrier, which effectively steers them in the opposite direction.



The paper, “Line Following Terrestrial Insect Biobots,” was presented Aug. 28 at the 34th Annual International Conference of the IEEE Engineering in Medicine & Biology Society in San Diego, Calif. The paper was authored by Tahmid Latif, a Ph.D. student at NC State, and co-authored by Bozkurt. Bozkurt has previously developed similar interfaces to steer moths, using implanted electronic backpacks.



REFERENCE
http://news.ncsu.edu/releases/wms-bozkurt-roach-autopilot/

Wednesday, June 26, 2013

The World's Smartest Robot




Hanson Robotics has launched a new Indiegogo campaign to create "the world's smartest robot." Named Adam Z1, the robot will eventually be able to speak, play with toys, draw pictures, and respond with emotions.
And the project is no joke. Robot designer and researcher David Hanson has brought together three other experts in this area: OpenCog Artificial General Intelligence (AGI) project founder Ben Goertzel, roboticist Mark Tilden, and consciousness theorist Gino Yu. Hanson himself is the creator of the robots Einstein, Zeno, Robokind, Bina48, and others.
A project to build a robot with the intelligence of a three-year-old
This idea — developing a robot with a child's intelligence — is interesting for two reasons. First, an AGI needs a body. And second, an AGI develops over time and through experiences.
KurzweilAI recently spoke to Goertzel:
“My goal as you know is to create AGI with human level and ultimately greater general intelligence,” he said. “But to get there, we need to create AGIs with basic common sense understanding of the everyday human world. And the easiest way to get an AGI to have basic commonsense understanding of the everyday human world, is to give it some rough approximation of a human embodiment and let it experience the world like a human. That’s the research purpose…
“There are also shorter term practical applications, e.g., Hanson Robokind’s main intended near-term application area is for education … to use robots as an educational tool for teaching programming. Hanson Robokind robots are already being used to help teach autistic kids.
“Not that long from now, full-sized humanoid robots will be in wide use as personal assistants, etc. And cheaper versions will be widespread as toys before long: think “RoboSapien with a cute face and a cloud-based mind.”
And interestingly, the team plans on making Adam Z1's software free and open source for others to improve upon.
Of course, the ethics of such a project is another thing entirely — a topic I will touch upon in a future post.

Friday, June 14, 2013

Introduction to Complex Systems: Patterns in Nature




This video provides a basic introduction to the science of complex systems, focusing on patterns in nature. (For more information on agent-based modeling, visit imaginationtoolbox.org.)



Anki







Founders



Boris Sofman, Ph.D., Co-Founder, CEO

As an engineer and researcher with experience in building diverse robotic systems—from consumer products to off-road autonomous vehicles and bomb-disposal robots—Boris is making it his life’s work to create products that people would not expect to be possible. Prior to founding Anki, Boris worked at iRobot and Neato Robotics. He earned a B.S., M.S. and Ph.D. from the Robotics Institute of Carnegie Mellon University. Unfortunately Anki doesn’t allow him to play tennis as often as he’d like.








Mark Palatucci, Ph.D., Co-Founder, Chief Product Officer

Mark wants to create products that truly surprise and excite people. Before co-founding Anki, Mark was the founding CEO and principal engineer at Copera, an embedded systems and mobile software company. Prior to that, Mark held engineering and research internships at Google and Intel. He earned a B.S.E. from the University of Pennsylvania and both an M.S. and a Ph.D. from the Robotics Institute at Carnegie Mellon University. Mark finds that flying airplanes gives him a thrilling excuse not to check his email.








Hanns Tappeiner, Co-Founder, President

Hanns is passionate about creating products that he always wanted but that didn’t exist. He has worked extensively at refining the connections between operator and robot, developing deeper senses of feel and control. Hanns has designed robots across the globe for companies in Germany, Italy, Austria and the US. Hanns studied at the University of Technology in Vienna before earning an M.S. in Robotics from the Robotics Institute at Carnegie Mellon University. On the weekend, Hanns is probably working or outdoors somewhere with his motorcycle.