Channel: video – Robohub

ShanghAI Lectures 2012: Lecture 7 “Collective Intelligence: Cognition from interaction”


In the 7th part of the ShanghAI Lecture series, Rolf Pfeifer talks about collective intelligence. Examples include ants that find the shortest path to a food source, robots that clean up, and birds that form flocks. In the guest lecture, István Harmati (Budapest University of Technology and Economics, Hungary) discusses the coordination of multi-agent robotic systems.
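The flocking example captures the core idea of the lecture: global order emerging from purely local interaction. As a rough illustration (my own sketch, not part of the lecture materials, with assumed parameters), the classic Boids model reproduces bird-like flocking from just three local rules per agent, separation, alignment and cohesion, with no central coordinator:

import numpy as np

N, DT, STEPS = 50, 0.1, 300
RADIUS = 2.0                          # neighbourhood radius (assumed)
W_SEP, W_ALI, W_COH = 1.5, 1.0, 1.0   # rule weights (assumed)

rng = np.random.default_rng(0)
pos = rng.uniform(-5, 5, size=(N, 2))   # random initial positions
vel = rng.uniform(-1, 1, size=(N, 2))   # random initial velocities

def limit(v, vmax=2.0):
    # Clip each velocity vector to a maximum speed.
    speed = np.linalg.norm(v, axis=1, keepdims=True)
    return np.where(speed > vmax, v * vmax / speed, v)

for _ in range(STEPS):
    acc = np.zeros_like(vel)
    for i in range(N):
        diff = pos - pos[i]
        dist = np.linalg.norm(diff, axis=1)
        mask = (dist > 0) & (dist < RADIUS)   # only local neighbours matter
        if not mask.any():
            continue
        sep = -np.sum(diff[mask] / dist[mask][:, None] ** 2, axis=0)  # move away from close neighbours
        ali = vel[mask].mean(axis=0) - vel[i]                          # match neighbours' velocity
        coh = pos[mask].mean(axis=0) - pos[i]                          # move toward neighbours' centroid
        acc[i] = W_SEP * sep + W_ALI * ali + W_COH * coh
    vel = limit(vel + DT * acc)
    pos += DT * vel

# A simple order parameter: 1.0 would mean all agents move in exactly the same direction.
alignment = np.linalg.norm(vel.mean(axis=0)) / np.linalg.norm(vel, axis=1).mean()
print("flock alignment:", round(float(alignment), 2))

After a few hundred steps the agents typically organize into one or a few coherently moving flocks, even though no agent knows anything about the group as a whole.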

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence run by Rolf Pfeifer and organized by me and partners around the world.

 

István Harmati: Coordination of Multi-Agent Robotic Systems

Control and coordination of multi-agent autonomous systems plays an increasing role in robotics. Such systems are used in a variety of applications including finding and moving objects, search and rescue, target tracking, target assignment, optimal military maneuvers, formation control, traffic control and robotic games (soccer, hockey). Control and coordination are often implemented at different levels. On the highest level, robot teams are given a global task to perform (e.g. attacking in robot soccer). Since it is hard to find an optimal solution to such global challenges, the methods presented are based mainly on heuristics and artificial intelligence (fuzzy systems, value rules, etc.). On the middle level, the individual robots are given tactics to reach the global goal; examples include kicking the ball toward the goal or occupying a strategic position on the field. On the lowest level, the robot is controlled to perform the desired behavior (specified by the strategy and the tactics). Finally, we also show the main issues and the general ideas related to the efficient coordination of multi-agent systems.
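The three levels described here can be pictured as a layered control loop. The skeleton below is purely illustrative (the class and function names are my own, not taken from the lecture): a strategy layer picks a global task with a simple heuristic, a tactics layer turns it into per-robot goals, and a low-level proportional controller drives each robot toward its goal.

from dataclasses import dataclass

@dataclass
class Robot:
    name: str
    x: float
    y: float

def strategy(team, ball):
    # Top level: heuristic global decision; here, always attack and pick the robot closest to the ball.
    closest = min(team, key=lambda r: (r.x - ball[0]) ** 2 + (r.y - ball[1]) ** 2)
    return {"task": "attack", "carrier": closest.name}

def tactics(team, plan, ball, goal):
    # Middle level: turn the global task into per-robot goals.
    goals = {}
    for i, r in enumerate(team):
        if r.name == plan["carrier"]:
            goals[r.name] = ball                           # go to the ball to kick it toward the goal
        else:
            goals[r.name] = (goal[0] - 1.5, goal[1] + i)   # take up a supporting position
    return goals

def low_level_control(robot, target, k=0.5, dt=0.1):
    # Lowest level: a simple proportional controller that drives the robot toward its target.
    robot.x += k * (target[0] - robot.x) * dt
    robot.y += k * (target[1] - robot.y) * dt

team = [Robot("A", 0.0, 0.0), Robot("B", 1.0, 2.0), Robot("C", -1.0, 1.0)]
ball, goal = (2.0, 0.5), (5.0, 0.0)
for _ in range(100):
    plan = strategy(team, ball)
    goals = tactics(team, plan, ball, goal)
    for r in team:
        low_level_control(r, goals[r.name])
print(plan, {r.name: (round(r.x, 2), round(r.y, 2)) for r in team})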

 

Related links:


ShanghAI Lectures 2012: Lecture 8 “Where is human memory?”


In this 8th part of the ShanghAI Lecture series, Rolf Pfeifer looks into differences between human and computer memory and shows several types of “memories”. In the first guest lecture, Vera Zabotkina (Russian State University for the Humanities) talks about cognitive modeling in linguistics; in the second guest lecture, José del R. Millán (EPFL) demonstrates a brain-computer interface.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence run by Rolf Pfeifer and organized by me and partners around the world.

 

Vera Zabotkina: Cognitive modeling in linguistics: conceptual metaphors

The concepts that govern our thought are not just matters of the intellect. They also govern our everyday functioning, down to the most mundane details. Our concepts structure what we perceive, how we get around in the world, and how we relate to other people. Our conceptual system thus plays a central role in defining our everyday realities. If we are right in suggesting that our conceptual system is largely metaphorical, then the way we think, what we experience, and what we do every day is very much a matter of metaphor… (Lakoff & Johnson, 1980)

In this lecture, Vera addresses the integration challenge facing cognitive science as an interdisciplinary endeavor. She highlights the interconnection between AI and Linguistics and discusses conceptual metaphors.

 

José del R. Millán: Brain-Computer Interfacing
In this lecture, José del R. Millán (EPFL) demonstrates the use of human brain signals to control devices, such as wheelchairs, and to interact with our environment.

Related links:

Ryan Calo on spyware for your brain


Ryan Calo discusses how researchers at Oxford, Geneva, and Berkeley have created a proof of concept for using commercially available brain-computer interfaces to discover private facts about today’s gamers.

ShanghAI Lectures 2012: Lecture 9 “Ontogenetic development”


In the 9th part of the ShanghAI Lecture series, we look at ontogenetic development as Rolf Pfeifer talks about the path from locomotion to cognition. This is followed by two guest lectures: The first one by Ning Lan (Shanghai Jiao Tong University, China) on cortico-muscular communication in the nervous system, the second by Roland Siegwart (ETH Zurich) on the design and navigation of robots with various moving abilities.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence run by Rolf Pfeifer and organized by me and partners around the world.

 

Ning Lan: Cortico-Muscular Communication of Movement Information by Central Regulation of Spindle Sensitivity

Ning Lan is Professor at the Shanghai Jiao Tong University in China. He tells us about cortico-muscular communication in the nervous system.

 

Roland Siegwart: Design and Navigation of Wheeled, Running, Swimming and Flying Robots

Roland Siegwart is professor at ETH Zurich and the director of the Autonomous Systems Laboratory.

Robots are rapidly evolving from factory work-horses, which are physically bound to their work-cells, into increasingly complex machines capable of performing challenging tasks such as search and rescue, surveillance and inspection, planetary exploration, or the autonomous transportation of goods. This requires robots to operate in unstructured and unpredictable environments and on various terrains. This talk will focus on design and navigation aspects of wheeled, legged, swimming and aerial robots operating in complex environments.

Siegwart presents wheeled inspection robots designed to crawl into machines and take various measurements, quadruped walkers that exploit natural dynamics and series elastic actuation, swimming robots, and autonomous micro-helicopters used to inspect cluttered, GPS-denied cities or narrow indoor environments. He also presents a small fixed-wing airplane capable of staying in the air indefinitely thanks to its solar-powered generator.

Related links:

ShanghAI Lectures 2012: Lecture 10 “How the body shapes the way we think”


This concludes the ShanghAI Lecture series of 2012. After a wrap-up of the class, we announce the winners of the EmbedIT and NAO competitions and end with an outlook on the future of the ShanghAI Lectures.

Then there are three guest lectures: Tamás Haidegger (Budapest University of Technology and Economics) on surgical robots, Aude Billard (EPFL) on how the body shapes the way we move (and how humans can shape the way robots move), and Jamie Paik (EPFL) on soft robotics.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence run by Rolf Pfeifer and organized by me and partners around the world.

 

Tamás Haidegger: Human Skills for Robots: Transferring Human Knowledge and Capabilities to Robotic Task Execution in Surgery

Almost 90 years ago, the idea of telesurgery was born, along with the initial concept of robots. From the early 1970s, researchers focused on robotic telepresence to empower surgeons to treat patients at a distance. The first systems appeared over 20 years ago, and robotic surgery has quickly become a standard of care for certain procedures, at least in the USA. Over the decades, the control concept has remained the same: a human surgeon guides the robotic tools based on real-time sensory feedback. However, from the beginning of this development, the more exciting (and sometimes frightening) questions have been linked to machine learning, AI and automated surgery. In the true sense of automation, there have so far been only unclear reports of a single robotically planned and executed surgery, despite the fact that many research groups are working on the problem. This talk introduces the major efforts currently undertaken in centers of excellence around the globe to transfer the incredibly diverse and versatile human cognition into the domain of surgical robotics.

References

  • P. Kazanzides, G. Fichtinger, G. D. Hager, A. M. Okamura, L. L. Whitcomb, and R. H. Taylor, “Surgical and Interventional Robotics: part I,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 2, pp. 122–130, 2008.
  • G. Fichtinger , P. Kazanzides, G. D. Hager, A. M. Okamura, L. L. Whitcomb, and R. H. Taylor, “Surgical and Interventional Robotics: part II,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 3, pp. 94–102, 2008.
  • G. Hager, A. Okamura, P. Kazanzides, L. Whitcomb, G. Fichtinger, and R. Taylor, “Surgical and Interventional Robotics: part III,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 4, pp. 84–93, 2008.
  • C. E. Reiley, H. C. Lin, D. D. Yuh, G. D. Hager. “A Review of Methods for Objective Surgical Skill Evaluation,” Surgical Endoscopy, vol. 25, no. 2, pp. 356–366, 2011.

 

Aude Billard: How the body shapes the way we move and how humans can shape the way robots move

In this lecture Aude Billard argues that it is advantageous to have robots move with dynamics that resemble those of natural bodies, even if the robots do not resemble humans in their physical appearance (e.g. industrial robots). This will make their motion more predictable for humans and hence make the interaction safer. She then briefly presents current approaches to modeling the dynamics of human motion in robots.

A survey of issues on robot learning from human demonstration can be found at:
http://www.scholarpedia.org/article/Robot_learning_by_demonstration

 

Jamie Paik: Soft Robot Challenge and Robogamis

Making a WALL-E robot


We take you now to sunny southern California, where a small group of enthusiasts has constructed a very realistic, Arduino-based replica of Pixar’s WALL-E, entirely from custom-fabricated parts.

The beloved WALL-E robot was just computer-generated graphics in the Pixar movie, but fans have spent years trying to bring him to life. We visit Mike McMaster’s workshop to see his incredible life-size WALL-E, a remote-controlled robot that lives among an R2-D2 droid and other pets on Mike’s orange farm.

Live Coverage of IROS Workshop on Science Communication


Don’t miss our live coverage of “Understanding Robotics and Public Opinion: Best Practices in Public Science Communication and Online Dissemination” at the IROS conference in Tokyo. LIVE NOW November 7 from 9am to 12:30pm (JST).

Check back to this post during the workshop for the latest live stream. Do you have questions for the panel? Use hashtag #irosSciCom or post in the discussion below.

 

Program

Science communication in robotics
9am-10am

Online tools to stay informed and expand your reach (Sabine Hauert, Massachusetts Institute of Technology)

Robots for people who know nothing about robots (Evan Ackerman, IEEE Spectrum)

How to pitch your project to investors (Andra Keay, Robot LaunchPad & Silicon Valley Robotics)


Robots in the public discourse
10am-10:30am

A swarm on every desktop: Lessons on crowdsourcing swarm manipulation
Aaron Becker and Chris Ertel, Rice University

Public attitudes towards robots and outreach to the public through the European Union-funded programme on Cognitive Systems and Robotics
Cécile Huet, European Commission

10:30am – 11am
Break

11am-11:30am
Overview of the #robotsandyou conversation

Communicating the history of robotics through the voices of roboticists: the oral history of robotics project
Peter Asaro, School of Media Studies, The New School
Selma Sabanovic, School of Informatics and Computing, Indiana University Bloomington
Staša Milojevic, School of Library and Information Science, Indiana University Bloomington
Sepand Ansari, School of Media Studies, The New School


Driving the discussion around robotics
11:30am-12:30pm

Panelists:
Peter Asaro, The New School
Ryan Calo*, University of Washington
Travis Deyle*, Hizook
Dario Floreano, EPFL
Cécile Huet, European Commission
Chris Mailey, Association for Unmanned Vehicle Systems International
AJung Moon*, University of British Columbia & Roboethics Info Database
Bruno Siciliano*, Università degli Studi di Napoli Federico II
Alan Winfield*, University of the West of England

* online participants

Organizers:
Sabine Hauert, MIT, USA
Bruno Siciliano, UNINA, Italy
Markus Waibel, ETH Zurich, Switzerland

Robot Holiday Video 2013: Autonomous Systems Lab, ETH Zurich

The Autonomous Systems Lab at ETH Zurich proudly presents this year’s Robotics Christmas Video:

Movement at night and a stolen tree, who the heck let the robots free? They gather and cheer, could Christmas be here?

 

 


DARPA Robotics Challenge Trials live broadcast


The time has come for the robots competing in the DARPA Robotics Challenge (DRC) to make their first public appearance. The trials for the final 2014 event will take place on December 20-21, 2013, and you can watch them live (or up close if you are lucky!). As stated on DARPA’s website:
“The Trials will provide a baseline on the current state of robotics and determine which teams will continue on to the DRC Finals in 2014 with continued DARPA funding. Competing in the 2014 Finals will lead to one team winning a $2 million prize.”
Click after the jump for the live video and Twitter stream, or go directly to http://www.theroboticschallenge.org/.

Live Stream

Day One

Day Two

 

 
[ UPDATE 2 ]: This is the final results list with the points gathered by each team. You can also read the extensive coverage from Automaton blog here:
http://spectrum.ieee.org/automaton/robotics/humanoids/darpa-robotics-challenge-trials-results
and of course on the official website:
http://www.theroboticschallenge.org/

[ UPDATE 1 ]: coverage from Automaton blog:
http://spectrum.ieee.org/automaton/robotics/humanoids/darpa-robotics-challenge-watch-it-live


Click here for the trials event agenda (pdf)

This is a general overview of the arena and the tasks the robots need to perform; you can find more on the official website.


Cubli – A cube that can jump up, balance, and walk across your desk


Update: New video of the final robot! My colleagues at the Institute for Dynamic Systems and Control at ETH Zurich have created a small robotic cube that can autonomously jump up and balance on any one of its corners.


Update

This latest version of the Cubli can jump up, balance, and even “walk”. This new version is self-contained with respect to power and uses three slightly modified bicycle brakes instead of the metal barriers used in the previous version. We are currently developing learning algorithms that allow the Cubli to automatically learn and adjust the necessary parameters if a jump fails due to the deterioration of the brakes and changes in inertia, weight, or slope of the surface.

This robot started with a simple idea:

Can we build a cube with 15 cm sides that can jump up, balance on its corner, and walk across our desk using off-the-shelf motors, batteries, and electronic components?

There are multiple ways to keep a cube balanced, but jumping up requires a sudden release of energy. Intuitively, momentum wheels seemed like a good way to store enough energy while still keeping the cube compact and self-contained.

Furthermore, the same momentum wheels can be used to implement a reaction-torque based control algorithm for balancing by exploiting the reaction torques on the cube’s body when the wheels are accelerated or decelerated.
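As a rough illustration of this balancing principle, consider a one-dimensional reaction-wheel pendulum. The sketch below is generic and uses assumed parameters; it is not the Cubli’s actual controller or specs. A state-feedback law computes the motor torque from the body angle, body rate and wheel speed, and the reaction torque on the body is equal and opposite to the torque applied to the wheel:

# 1D reaction-wheel pendulum balancing sketch (generic illustration, not the Cubli's controller).
# States: body angle from upright (rad), body rate (rad/s), wheel speed (rad/s). Input: motor torque.
import numpy as np
from scipy.signal import place_poles

m, l, g = 0.6, 0.085, 9.81   # assumed body mass, CoM distance from pivot, gravity
J, Iw = 4e-3, 6e-4           # assumed body inertia about the pivot and wheel inertia

# Linearized dynamics about the upright equilibrium, x_dot = A x + B u.
# The motor torque u spins the wheel; the equal and opposite reaction torque acts on the body.
A = np.array([[0.0,           1.0, 0.0],
              [m * g * l / J, 0.0, 0.0],
              [0.0,           0.0, 0.0]])
B = np.array([[0.0], [-1.0 / J], [1.0 / Iw]])

# State feedback u = -K x, with moderate, well-damped closed-loop poles chosen by hand.
K = place_poles(A, B, [-4.0, -5.0, -0.8]).gain_matrix

# Simulate the closed loop from a small initial tilt (simple forward-Euler integration).
x, dt = np.array([0.1, 0.0, 0.0]), 0.002
for _ in range(5000):   # 10 s
    u = -(K @ x)[0]
    x = x + dt * (A @ x + B[:, 0] * u)
print("state after 10 s [angle, rate, wheel speed]:", np.round(x, 4))

With these assumed numbers the body angle settles near zero within a few seconds, while the residual wheel speed bleeds off more slowly.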

 

Can this work?

The first step in creating the robot, therefore, was to look at the physics to figure out if a jump-up based on momentum wheels was possible. The image below shows some of the math to figure out the Moment of Inertia (MOI) of the wheel and full cube.

(Image: notes working out the moments of inertia of the wheel and the full cube.)

This mathematical analysis provided a quantitative understanding of the system and informed design choices, such as the trade-off between using three momentum wheels and a design with a momentum wheel mounted on each of the six inner faces of the cube.

Another outcome of this analysis was a good understanding of the velocities the momentum wheel needs to reach for the cube to jump up, and of the torques required to keep the cube balanced. Both factors were critical for the next step: determining the required hardware specs.
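The feasibility question can be sketched numerically. The toy calculation below uses assumed parameters, not the Cubli’s actual specs, and lumps the wheel mass into a uniform-cube approximation: conservation of angular momentum at the instant the wheel is braked gives the body rate, and an energy check requires that the resulting rotational energy about the pivoting edge exceed the potential energy needed to lift the centre of mass from the face-down to the edge-balancing position.

# Back-of-the-envelope jump-up check (assumed parameters, not the Cubli's actual specs).
import numpy as np

l = 0.15                    # cube side length (m)
m = 1.2                     # total mass (kg), assumed
Iw = 6e-4                   # momentum wheel inertia about its spin axis (kg m^2), assumed
Ic = 2.0 / 3.0 * m * l**2   # inertia of a uniform cube about one edge (wheel mass lumped in)
g = 9.81

# Potential energy needed to pivot from lying flat (CoM height l/2)
# to balancing on an edge (CoM height l/sqrt(2)).
dE = m * g * l * (1.0 / np.sqrt(2) - 0.5)

# Braking the wheel transfers its angular momentum to the whole body:
#   Iw * w_wheel = (Ic + Iw) * w_body
# and the body then needs (1/2) (Ic + Iw) w_body^2 >= dE to reach the balance point.
w_body_min = np.sqrt(2 * dE / (Ic + Iw))
w_wheel_min = (Ic + Iw) * w_body_min / Iw

print(f"energy to lift the CoM:         {dE:.3f} J")
print(f"min body rate after braking:    {w_body_min:.1f} rad/s")
print(f"min wheel speed before braking: {w_wheel_min:.0f} rad/s "
      f"(about {w_wheel_min * 60 / (2 * np.pi):.0f} rpm)")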

 

Specs and hardware design
Given the required velocities and torques determined above, it was clear that the momentum wheel’s motor and gearbox would be a major challenge in creating the robot. The mathematical model made it possible to tackle this problem systematically, through a quantitative analysis of the trade-offs between higher velocities (i.e., more energy for the jump-up) and higher torques (i.e., better stability when balancing).


This mathematics-driven hardware design resulted in detailed specs for the robot’s core hardware components (momentum wheels, motors, gears, and batteries) and allowed a CAD design of the entire system.

 

 

Part of this step was the design of a special brake that suddenly stops a momentum wheel, transferring its energy to the entire cube and causing it to jump up.


The photo to the left shows an early design of this brake, consisting of a screw mounted on the momentum wheel, a servo motor (shown in black) to move a metal plate (in blue) into the screw’s path (in light brown), and a mounting bracket (in light brown) to transfer the momentum wheel’s energy to the cube structure. The current design uses a combination of hardened metal parts and rubber to reduce peak forces.

 

2D prototype
To validate the mechanics and electronics of the jump-up and balancing strategy, and to prove the feasibility of the overall concept, a one-dimensional version was built:

The results obtained with this 2D version of the cube were published in an IROS 2012 paper.

 

The final robot
Following successful tests with the 2D version, a full robot was built. The result is Cubli, a small cube-shaped robot, named after the Swiss-German diminutive for “cube”.

As you can see in the video, Cubli can balance robustly.

However, first jump-up tests showed that the stress resulting from a sudden braking of the momentum wheel led to mechanical deformations of the momentum wheels and the aluminium frame. This made repeated jump-ups of the whole Cubli impossible without part replacement. It was therefore decided to tweak the structure and braking mechanism to reduce the mechanical stress caused by the jump-up.

In addition to balancing, my colleagues are now investigating the use of controlled manoeuvres of jumping up, balancing, and falling over to make the Cubli walk across a surface.

Cubli - standing up

Key infos
Robot name: Cubli
Researchers: Mohanarajah Gajamohan, Raffaello D’Andrea
Mechanical design: Igor Thommen
Websites: http://www.idsc.ethz.ch/Research_DAndrea/Cubli, http://raffaello.name/cubli
Status: Ongoing research project
Last update: March December 2013
Note 1: This post is part of our Swiss Robots Series. If you’d like to submit a robot to this series, or to a series for another country, please get in touch at info[@]robohub.org.
Note 2: This work was done at the Institute for Dynamic Systems and Control, ETH Zurich, Switzerland and was funded in part by the Swiss National Science Foundation (SNSF), grant number 146717.
Note 3: If you have questions, post them below and we’ll post answers.


Thanks Gajan!

If you liked this article, you may also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

Robot Holiday Videos 2013


Our robot holiday videos for 2012 were such a hit that we decided to do it again this year. Here is the list of robot holiday videos 2013. Thank you for spreading the holiday spirit!

From Aldebaran Robotics:
Happy New Year 2014!

From Michael McMaster:
Merry Christmas from our droids, to yours!

From the LASA lab at EPFL:
The LASA robots secretly team up with Santa to organize the Christmas gifts! Happy Holidays from the Learning Algorithms and Systems Lab!

From the Ascending Technologies team:
Can you imagine drones delivering X-mas presents? We can – with a smile.

From the Networked and Embedded Systems Lab:
What Christmas is all about: fun, family, and robots! ;-)

The Autonomous Systems Lab at ETH Zurich proudly presents this year’s Robotics Christmas Video:

Movement at night and a stolen tree, who the heck let the robots free? They gather and cheer, could Christmas be here?

Watch Zero Robotics SPHERES Challenge live


Astronaut Chris Cassidy with SPHERES (photo: NASA)
NASA, MIT and DARPA will host the Fifth Annual Zero Robotics SPHERES Student Challenge today at 7:30am (EST). The event will take place at MIT’s campus in Cambridge, Mass., where student teams from the US and other countries will join NASA, ESA, MIT, DARPA, the Center for the Advancement of Science in Space, and IT consulting firm Appirio. As stated in the official press release:

“For the competition, NASA will upload software developed by high school students onto bowling ball-sized spherical satellites called Synchronized Position Hold, Engage, Reorient, Experimental Satellites, or SPHERES, which are currently aboard the International Space Station. From there, space station Expedition 38 Commander Oleg Kotov and Flight Engineer Richard Mastracchio will command the satellites to execute the teams’ flight program. “

Below you can find the link to the live broadcast:

http://webcast.mit.edu/i/institute/2013-2014/zero_robotics/17jan/index.html

You can also watch it on NASA TV:






SPHERES consists of three flying satellites on board the ISS that can be used to test a diverse range of software and hardware. You can find more info on NASA’s website, and you can also listen to our earlier interview with Dr. Alvar Saenz-Otero from MIT, lead scientist of the SPHERES project, on Robotspodcast.

(photos by NASA)

Quadrocopter failsafe algorithm: Recovery after propeller loss


UPDATE 04/03/2014:

In this video update, we show that a quadrocopter can be safely piloted by hand after a motor fails, without the aid of a motion capture system. This follows our previous video, where we demonstrated how a complete propeller failure can be automatically detected, and that a quadrocopter can still maintain stable flight despite the complete loss of a propeller. 

In the earlier video, we relied on an external motion capture system to measure the quadrocopter’s position and orientation.  By moving more of the algorithm onto the vehicle, the quadrocopter can now be piloted by hand after the failure. The algorithm is executed on the quadrocopter’s onboard micro-controller, and the only sensors required are the quadrocopter’s angular rate gyroscopes. We use blinking LEDs, mounted on the quadrocopter’s arms, to indicate a virtual yaw angle, so that the pilot can control the vehicle with the same remote control commands after the failure. As an alternative to the LED system, an onboard magnetometer could be used to track the vehicle’s yaw angle. Alternatively, by using more sophisticated algorithms, the system could be made to work using only the rate gyroscopes.

ORIGINAL STORY 02/12/2013

The video in this article shows an automatic failsafe algorithm that allows a quadrocopter to gracefully cope with the loss of a propeller. The propeller was mounted without a nut, and thus eventually vibrated itself loose. The failure is detected automatically by the system, after which the vehicle recovers and returns to its original position. The vehicle finally executes a controlled, soft landing on the user’s command.

The failsafe controller uses only hardware that is readily available on a standard quadrocopter, and could thus be implemented as an algorithmic-only upgrade to existing systems. Until now, the only way a multicopter could survive the loss of a propeller (or motor) was by having redundancy (e.g. hexacopters, octocopters). However, this redundancy comes at the cost of additional structural weight, reducing the vehicle’s useful payload. Using this technology, (more efficient) quadrocopters can be used in safety-critical applications, because they retain the ability to gracefully recover from a motor/propeller failure.


(A) shows the quadrocopter in normal operation. In (B) the propeller detaches due to vibrations, and the quadrocopter starts pitching over in (C) – (E). In (F) the vehicle has regained control, and is flying stably.

The key functionality of the failsafe controller is a novel algorithm that I developed as part of my doctoral research at the Institute for Dynamic Systems and Control at ETH Zurich. This new approach allows such a vehicle to remain in flight despite the loss of one, two, or even three propellers. Having lost one (or more) propellers, the vehicle enters a continuous rotation — we then control the direction of this axis of rotation, and the total thrust that the vehicle produces, allowing us to control the vehicle’s acceleration and thus position.
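The full controller is more involved, but the outer loop of such an approach can be sketched generically: since the spinning vehicle can still orient one body-fixed axis and modulate total thrust, position control reduces to choosing a desired acceleration, converting it into a desired thrust direction and magnitude, and handing those to an inner attitude/rate loop (not shown). The sketch below is a generic illustration under those assumptions, with made-up parameters; it is not the published algorithm.

# Generic outer-loop sketch for a thrust-vector-only vehicle (illustrative, not the published algorithm).
import numpy as np

g = np.array([0.0, 0.0, -9.81])   # gravity (m/s^2), world z pointing up
mass = 0.5                        # vehicle mass (kg), assumed

def outer_loop(p, v, p_des, kp=4.0, kd=3.0):
    """Position controller: desired acceleration -> desired thrust direction and magnitude.

    Returns the unit vector the primary (average thrust) axis should point along,
    and the total thrust in newtons. An inner loop (not shown) must then steer the
    spinning vehicle's rotation axis toward that direction.
    """
    a_des = kp * (p_des - p) - kd * v     # PD acceleration command
    f_vec = mass * (a_des - g)            # required thrust vector (compensates gravity)
    thrust = np.linalg.norm(f_vec)
    n_des = f_vec / thrust if thrust > 1e-6 else np.array([0.0, 0.0, 1.0])
    return n_des, thrust

# Quick check: hovering at the setpoint should ask for vertical thrust equal to the weight.
n, f = outer_loop(p=np.zeros(3), v=np.zeros(3), p_des=np.zeros(3))
print(n, f)   # -> [0, 0, 1], about 4.9 N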

Even if the vehicle can no longer produce sufficient thrust to support its own weight, this technology would still be useful: one could, for example, try to minimize the multicopter’s velocity when it hits the ground, or steer the multicopter away from dangerous situations such as water, or people on the ground.

This control approach can also be applied to design novel flying vehicles — we will be releasing some related results soon.

This technology is patent pending.

For more information, have a look at the Flying Machine Arena website, the IDSC research page, or just post your question in the comments below.

 

If you liked this article, you may also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

No drone experts were harmed in the making of this video … Or were they?


What’s with all the quadrotors in auto advertising these days? And what do quadrotor swarms have to do with cars? Probably not much at all, but apparently associating your auto brand with high-performance quads is de rigueur. Subaru is following the lead of Lexus (which launched its quadrotor ad last November), upping the ante by having the driver of the new WRX STI engage in a pas de deux (or should we say, ‘pas de plusieurs’?) with a swarm of 300 LED-lit quadrotors. It makes for some pretty stunning footage, but before you get too excited, unlike the original Lexus ad (which had at least a decent portion of real footage from KMel’s impressive quads), almost all of the quadrotor eye-candy in the new Subaru ad is CGI. The automaker’s desire to associate itself with cutting-edge technology may be a sign of just how popular quadrotors have become, but is hyper-realistic CGI enhancement inflating consumers’ expectations of what quadrotors can actually do? (see the video below)

Quadrotors are famous for performing amazing stunts at the frontier of what we think is possible for a machine. DDB Canada/Tribal Worldwide, the ad agency behind this project, wanted to associate the performance, and especially the maneuverability, of the new Subaru WRX STI with the stunts performed in various quad videos gone viral.


The production used live action captured in the Hughes Airport Hangar in Los Angeles, California (which was purchased by YouTube and converted into a 41,000-square-foot mega studio), along with Big Block’s Drive-a-tron system, where a car is accurately modelled both geometrically (with the manufacturer’s CAD models) and dynamically (its physics and performance envelope). However, the quadrotors (although modelled according to real drones) are all computer-generated (note that none of the behind-the-scenes photos contain a quadrotor). The final footage is almost completely CGI, and the process took eight weeks.


By using models that emulate real life, the production ensures that what you see in the video could have been performed in real life. That is true for the car, which is accurately modelled, but it is speculative for the quads, even if it is not totally far-fetched given what has already been done (though not at that scale) by several labs like the Flying Machine Arena, or even KMel, the robotics company behind the Lexus drone ad.

Just how far away are we from quadrotors actually being able to perform stunts like these? Raffaello D’Andrea, the lead researcher behind the Flying Machine Arena, says “With enough of a budget, this could be done by a few groups of people now. But it is much, much cheaper to do this with CGI.”

“It definitely changes people’s expectations and is no different than the portrayal of any advanced technology in movies, television, and videos. But: seeing it live is completely different; even though people may be partially ‘desensitized’ by CGI, this immediately disappears when they see it live,” added D’Andrea, who is no stranger to giving live demos that feature his quadrotor research. “As someone who thrives on doing live demonstrations, I don’t see this as being harmful in any way.”

We also asked Alan Winfield, Hewlett-Packard Professor of Electronic Engineering at UWE Bristol and an expert on the portrayal of robotics in the media, for his thoughts on whether CGI will inflate people’s expectations of what quads can do, and he agreed that it’s a problem. But, he adds, “What I also find curious and interesting (in a geeky kind of way) is that the CGI in the Subaru ad faithfully reproduces the grey reflective spheres needed by the tracking system in labs – especially at ~44s. It’s ironic that the eye-candy drones include the very tracking tech that many people don’t realise is needed to make them do precision formation acrobatics.”

We can’t blame Subaru for wanting to jump on the quadrotor bandwagon … quadrotors are popular, cutting edge and hip (they are already making an entry into concept cars).  But while car enthusiasts will already have their opinions about the new Subaru WRX STi (it’s an evolution of a well-proven and popular model), we have to wonder how people’s expectations of drone technology will be shaped when they’re exposed to mostly CGI videos and advertisements rather than real footage.

In fact there is some astounding work being done with high-performance quads and aerial swarms. If you want to get a sense of what’s really doable, check out the following 2012 video, which was developed for the Ars Electronica Futurelab with input from ETH Zurich, the University of Pennsylvania’s GRASP Lab, and the MIT Media Lab. You can find the full behind-the-scenes story about this project here.

PS. The Lexus ad mentioned above

Amazing in Motion – SWARM (Lexus Ad)

The making of “SWARM”

 

If you liked this article, you may also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

 

New video shows range and versatility of professional service robots


A new video promoting this year’s AUTOMATICA event in Munich offers an excellent primer on the state of professional service robotics in Europe. Check out the video here:


Four recent videos about robots and jobs


Are robots coming for your job, or are they going to make your job easier? We’ve been following this issue since we ran our Robots and Jobs focus series, and there is no doubt that it’s a topic that’s here to stay. Check out these recent videos that are contributing to the robots and jobs discussion. 

Excellent video by CGP Grey about the inevitability of automation and its impact on human employment.

 

NBC clip following the recent release of the Pew Research Center’s report on robots and jobs. Check out also this audio clip from CBC radio on the same topic.

 

A new video series by Miller Weldmaster pits Man vs. Machine in various humorous scenarios. Tongue in cheek!

 

DNews discusses how robots might make our lives easier.

Video: The making of an Ant Intelligent Robot (AIR)


The Ant Intelligent Robot (AIR) is a small but powerful mobile robot platform designed for use in a heterogeneous robotic swarm currently under development at the Laboratory of Artificial Intelligence and Multi-Agent Systems (AI-MAS) at the University “Politehnica” of Bucharest, under the supervision of Prof. Adina Magda Florea. I made this video to depict the major development stages of the project.

Some of AIR’s feature highlights are:

  • Robot application development can be done in almost any programming language (C/C++, Java, Python, Ruby, Perl, even Bash).
  • The robot has a distributed fault-tolerant hardware architecture with a state-of-the-art control and positioning system.
  • It runs a real-time Linux OS.
  • It is equipped with range and proximity sensors, an image-processing module and an audio recognition system.
  • It weighs approximately 0.7 kg, has a circumference of 10 cm and, like an ant, can push, pull or carry up to 9 kg (more than 10 times its own weight).

More info:

AI-MAS Group Website
Prof. Adina Magda Florea’s Homepage

Antares orb-3 accident

Orbital Sciences Corp. Antares Orb-3, pre-launch (photo: NASA)

A very unfortunate incident for NASA and the Commercial Orbital Transportation Services program took place yesterday. The Antares rocket that was about to send the Cygnus spacecraft to the ISS exploded a few seconds after its launch from NASA’s Wallops Flight Facility. No casualties or even minor injuries were reported, although the area is being contained and treated with caution. It is a major incident for US spaceflight that breaks a trouble-free period and could have important implications for the private spaceflight sector.

The Antares launcher and the Cygnus spacecraft it was carrying were both developed and operated by Orbital Sciences Corporation, one of the two private companies (along with SpaceX) under contract with NASA to supply the International Space Station with cargo, supplies, and secondary experimental and commercial payloads. The video below shows that the explosion started from the base of the rocket, close to the nozzles, while the rest of the vehicle was still intact. The first-stage engines are the main suspect, but it is still too early to know exactly what went wrong.

As stated at NASA’s press conference, the flight termination system was activated. This is a remote manual intervention that initiates a self-destruction mechanism when the vehicle’s attitude is beyond recovery or control, in order to contain the damage from any explosion or debris in the launch area. As you can see in the launch video and in the video below (shot from a small airplane at a distance), the explosion was huge. Even hours later, fires fed by kerosene and solid propellant were still burning.

Unlike SpaceX, which is almost completely vertically integrated, Orbital outsources most of its components and focuses on system integration and design. Antares is a two-stage rocket, and its first stage uses a pair of liquid-propellant motors originally designed (and constructed!) during the ’60s for the Soviet N-1 rocket. Despite their age, these motors are highly advanced, more so than most contemporary rocket engines. They were built as NK-33s in Russia but have been refurbished by the US company Aerojet (and renamed AJ26). It is too early to attribute the accident to these engines, but it is worth mentioning that the use of Russian rocket engines by US operators has recently become a very sensitive matter. The Ukraine crisis and the US/European sanctions are placing Orbital and ULA (the US defense contractor for space launches, which also uses Russian-made RD-180 engines) in an uncomfortable position. A side effect of that crisis is a boost to the US private sector, which could step in and fill the gap created by the withdrawal of Russian engines from US launches.

(photo: Orbital Sciences Corp.)

That may be beneficial for the private sector in general, but not so for Orbital, which could face delays until this mishap is resolved. Meanwhile, the competition from SpaceX and established companies like Lockheed and Boeing is very strong. During the press conference that took place shortly after the accident, NASA officials stated their satisfaction with the way Orbital operates and their intention to resume Antares/Cygnus missions once the accident investigation is complete.

The COTS program, apart from its obvious mission of ISS resupply, also aims to validate the privately developed Dragon and Cygnus spacecraft. Both usually have room and payload to spare, so they are used extensively for transporting experiments. The Cygnus cargo vehicle destroyed in this accident was carrying many small satellites owned by schools, universities and startups. For example, Planet Labs, an Earth-imaging startup, lost 26 small satellites that were on board. Orbital stated that the payload is insured, although the extent of the coverage and the details differ for each client and contract. The rocket and general hardware are also at least partially covered, and as stated at the press conference, NASA and Orbital will find a way to deal with the cost of any replacements.

Any correlation between the payload and this accident is practically impossible, so the rules and requirements should not change for the next missions. However, Orbital could take some time to resume the rest of its contracted missions, and the delay or additional modifications may raise the payload cost. The accident could also indirectly raise the payload cost for other operators simply by reducing supply. Insurance prices could also increase, and the total cost for a school, research team or startup may be higher for future missions. Nevertheless, there is unprecedented activity in commercial spaceflight today, even if we are still very far from the glory days of the nationally funded Apollo or Space Shuttle programs. Competition will certainly make access to space easier and cheaper for everyone, even with occasional incidents like this along the way.

Robotics then and now: IJARS interview with Peter Corke


In this interview, Peter Corke gives us a retrospective on the differences between the field of robotics now and when he just started his career 30 years ago, pointing out what strikes him as the most important milestones in robotics in the past 10 years. He goes on to share his view on the role of editorship, and the difference between robotics research papers and articles published in a robotics magazine, as well as his perspective on traditional publishing vs open access publishing.

He gives us a detailed overview of his latest work on the application of robotics in agriculture, with the aim of increasing food production while lowering the cost of farming, the importance of further developing the sub-fields of environmental monitoring and preservation, and the advantages and disadvantages of starting a robotics lab nowadays and using social media to promote its activities.

Prof. Corke also reveals who inspired him the most in his career (including the likes of Malcolm C. Good, Richard Paul and Rodney Brooks), as well as what fictional robot he grew up watching – we give you a hint in the form of bright red claws tagged Class M-3 Model B9.

Peter Corke lives in Brisbane with his wife and a cat. By day he’s a professor at Queensland University of Technology. By night he maintains two open-source toolboxes, one for robotics and one for vision. His interests include robotics, computer vision, embedded systems, control and networking. He has worked on robotic systems for mining, aerial and underwater applications. You can learn more about his work and what he wanted to be when he grew up in this interview.

Corke P. IJARS Video Series: Peter Corke Interview with the International Journal of Advanced Robotic Systems [online video]. International Journal of Advanced Robotic Systems, 2014, 11:V1. DOI: 10.5772/59497

If you liked this post, you may also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

Novel robots for gait and arm rehabilitation: IJARS interview with Robert Riener

ARMin neuro-rehabilitation device.

In this pair of video lectures, Robert Riener presents his team’s research efforts in the field of rehabilitation robotics, and describes the vision behind Cybathlon, the competition for robot-assisted parathletes.

Novel robots for gait and arm rehabilitation

Patients who have limited use of their arms or legs due to stroke or spinal cord injury require intensive therapy to help them rehabilitate. This rehabilitation process is long, exhausting and expensive, and requires intensive work with specialists. With robotic technology, many of these challenges can be overcome.

Systems for robot-aided gait training enable patients to start intensive therapy right away, and to train for longer hours. The ARMin neuro-rehabilitation device helps stroke patients regain physical capabilities; research results show that patients who undergo ARMin training heal faster than patients working with therapists alone.

These devices involve intense human-robot cooperation and therefore have been developed according to human-robot interaction best practices. Future research will be focused on improving the effectiveness of robotic rehabilitation devices and their overall use in the medical field.

Riener R. IJARS Video Series: Novel Robots for Gait and Arm Rehabilitation [online video]. International Journal of Advanced Robotic Systems, 2014, 11:V2. DOI: 10.5772/59833

Cybathlon 2016, The Championship for Robot-Assisted Parathletes

Research labs around the world have been focused on developing novel assistive technologies such as wheelchairs, exoskeletal and mechatronic devices to help patients with spinal cord injuries in their daily life activities. There is still a long way to go before these kinds of devices will enable complete movement for these patients.

That is why Dr. Robert Riener and his team of collaborators are organising a special kind of event, Cybathlon 2016, similar to the Paralympic Games. During Cybathlon, participants will compete in six disciplines while using wheelchairs, exoskeletons, brain and muscle stimulation devices, and other technologies.

The goal for researchers is to learn the strengths and the weaknesses of these devices and ultimately make them more effective for patients. The project also aims to promote wider use of robotic technology solutions in the fields of medical treatment where extensive physical therapy and assistance is required.

Riener R. IJARS Video Series: Cybathlon 2016, The Championship for Robot-Assisted Parathletes [online video]. International Journal of Advanced Robotic Systems, 2014, 11:V3. DOI: 10.5772/59837

Robert Riener is Full Professor for Sensory-Motor Systems at the Department of Health Sciences and Technology, ETH Zurich. He became Assistant Professor for Rehabilitation Engineering at ETH Zurich in May 2003, was promoted to Associate Professor in June 2006 and to Full Professor in June 2010. As he holds a double professorship with the University of Zurich, he is also active in the Spinal Cord Injury Center of the Balgrist University Hospital (Medical Faculty of the University of Zurich).

His current research interests involve human motion synthesis, biomechanics, virtual reality, man-machine interaction, and rehabilitation robotics. He has authored and co-authored more than 400 peer-reviewed journal and conference articles and 20 patents. He is a member of several scientific societies (e.g., IEEE/EMBS, DGBMT/VDE, IFESS) and an associate editor of several scientific journals. For his development of the arm therapy robot ARMin, he has received several prizes, including the humanTech Innovation Prize and the Swiss Technology Award. He also received the IEEE TNSRE Best Paper Award 2010 and the euRobotics Technology Transfer Awards in 2011 and 2012.

If you liked this post, you may also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.
