
The CORAL group: Interview with Joydeep Biswas and Brian Coltin


Our newest video interview features PhD student Joydeep Biswas, who works in Dr. Manuela Veloso’s CORAL research group, and scientist Brian Coltin, who has been at NASA’s Ames Intelligent Robotics Group since completing his PhD at Carnegie Mellon under Dr. Veloso’s supervision. In the interview, both scientists answer questions together about CORAL’s recent projects, the types of robot tasks they work on, and the hardware and software they use; they are also interviewed separately about their love for robotics and what led them to choose this career path in the first place.

The CORAL lab’s research focuses on further developing its CoBot robots: robots that are aware of their perceptual, physical, and reasoning limitations and proactively ask humans for help, for example with object manipulation. Brian Coltin focused on robot scheduling before transferring to NASA’s Ames Intelligent Robotics Group, while Joydeep Biswas studies how robots handle changes in human environments. Both researchers share a general interest in improving the autonomy and intelligence of robotic systems in the real world. Joydeep also participates in CORAL’s robot soccer project and maintains all of the lab’s robot hardware. At CORAL, they mainly work with CoBots, soccer-playing robots, Baxters and Naos. Among the most challenging problems they face are obstacle avoidance and enabling a CoBot to recognize when it must rely on, and autonomously ask for, human help in adapting to its environment. Biswas and Coltin give us their views on how these challenges are being handled and on what will likely be the turning point in achieving fully autonomous robots.

Another interesting point made in the interview concerns the current state of published robotics research. According to our interviewees, there is considerable overlap among research groups, with researchers independently rediscovering the same challenges over and over again; hence the need for more collaborative effort to limit time spent solving problems already tackled by other robotics labs. In this view, as underlined by Biswas, publishing more in Open Access would allow a larger audience of robotics professionals, as well as anyone interested in working with the latest robotics software and hardware, to freely access the latest research published by labs such as CORAL. In addition, Biswas points to the need to make it easier for roboticists to publish code and data (both closed binaries and open data) that a wider audience could then run directly on their own computers – an avenue to be explored by Open Access publications such as our own International Journal of Advanced Robotic Systems.

In the last part of the interview, we asked Coltin and Biswas separately about their personal challenges in pursuing their PhD degrees and about who influenced them most in their careers. You may be surprised by whom they name as their favorite roboticists, and by what they admire most in their current adviser, Dr. Veloso. Less surprising is their favorite fictional robot – which we will let you discover for yourself.

Biswas J, Coltin B. IJARS Video Series: Joydeep Biswas and Brian Coltin interview – CORAL research group (Carnegie Mellon University) [online video]. International Journal of Advanced Robotic Systems, 2014, 11:V4. DOI: 10.5772/59985

If you liked this interview, you may also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.


Holiday robot video 2014: Autonomous Christmas Lab


Our first holiday robot video of the season is in! From the Autonomous Systems Lab at ETH Zurich’s Institute of Robotics and Intelligent Systems:

Dear Robohub and Robots Podcast teams,

We are happy to share with you our latest Robotics Christmas video!

Have a holiday robot video of your own that you’d like to share? Send your submissions to info [at] robohub.org!

Holiday robot video 2014: ArtiMinds Robotics


From ArtiMinds Robotics:

Dear Robohub,

Our robot lights the Advent wreath with matchsticks!

Have a holiday robot video of your own that you’d like to share? Send your submissions to info [at] robohub.org!

Holiday robot video 2014: GraspLab @ CITEC Bielefeld


From the GraspLab at CITEC Bielefeld:

Dear Robohub Team,

Here’s a Christmas-themed video from our institute’s bimanual GraspLab, with a catchy melody and a nerdy robotic voice …

 

Have a holiday robot video of your own that you’d like to share? Send your submissions to info [at] robohub.org!

Holiday robot video 2014: UTARI


From the University of Texas at Arlington Research Institute (UTARI):

Hello Robohub!

Here is our holiday video highlighting some of our robotic capabilities in a fun festive way. Enjoy the holiday from all of us here at the UT Arlington Research Institute!

 

 

Have a holiday robot video of your own that you’d like to share? Send your submissions to info [at] robohub.org!

Holiday robot video 2014: Robot Drive-In Movies


From Penny and Harry at Robot Drive-In Movies:

Season’s Greetings! We have been watching your Christmas robot videos – they are great! Ours features Ozobot, a small line-sensing robot sold as a game piece. To us, he is an actor! In this case, Santa!

Have a holiday robot video of your own that you’d like to share? Send your submissions to info [at] robohub.org!

Video: A day in the life of BeatBots’ Marek Michalowski


At ConnectEd Studios we had the pleasure of visiting the workshop of Marek Michalowski, a co-founder of BeatBots. BeatBots is the robotic design studio behind the Keepon, and creates dynamic robotic characters for therapy, research, education, and entertainment.

We interviewed Michalowski about his career in an effort to inspire and educate high school students about careers in robotics. Enjoy!

Introducing Spot, a new smaller 4-legged robot from Boston Dynamics


Boston Dynamics just released a video of a new four-legged robot named “Spot”. It is an evolution along the lines of their previous four-legged robots such as BigDog and WildCat, but this one is much smaller and lighter (160 lb / 72.5 kg). As usual, not many details are known, but Spot is electrically powered (earlier models had an internal combustion engine on board) and has a prominent rotating LIDAR on top.

Spot can perform the usual Boston Dynamics trick: it can withstand a kick without tipping over, in an eerily lifelike manner. It can also move slowly and accurately indoors, while being able to run faster if necessary.

You can watch the video below and you can read other Boston Dynamics articles here.


Pleurobot: Multimodal locomotion in a bioinspired robot

The Pleurobot (Photo: Hillary Sanctuary & BioRob).

The Pleurobot is a bioinspired robot being developed by the BioRob lab at EPFL and NCCR Robotics. Taking its cues from the salamander, the Pleurobot is a walking robot that can change its gait to navigate uneven terrain, and it is currently learning to swim. Watch the video to see the researchers discuss what they are doing with the Pleurobot and how they hope to improve it in the future.

 

 

For further information on the Pleurobot please see:

The project’s homepage

Introducing Pleurobot (video)

 


Ground-flight collaboration

Ground-Air collaboration between robots (Photo: RPG and Alain Herzog).

Working in the field of rescue robotics, the Robotics and Perception Group (UZH and NCCR Robotics) studies how to get aerial robots communicating with ground robots, with the aim of exploiting the strengths of each by having them work as a team. In the video below, PhD student Elias Müggler explains how he is doing this.

 

 

For further information:

Robotics Perception Group website

Aerial-guided Navigation of a Ground Robot among Movable Obstacles (paper by Elias Müggler)

Autonomous flying robots – Davide Scaramuzza at TEDxZurich:

Demonstration of the project, which won the KUKA Innovation Award 2014:

Sensory-motor tissues for soft robots

Sensory-Motor Tissues for Soft Robots

In this video, Jun Shintake, a PhD student at LIS, EPFL and NCCR Robotics, explains his project “Sensory-Motor Tissues for Soft Robots”.

 

For more information on this project please see:

The project’s website

New soft antagonistic actuator enables robots to fold

A foldable antagonistic actuator (academic paper)

Variable stiffness material based on rigid low-melting-point-alloy microstructures embedded in soft poly(dimethylsiloxane) (PDMS) (academic paper)

 

Livestream: Rodney Brooks, Abhinav Gupta, Andrew McAfee on AI and the rise of robots


Livestream of the Council on Foreign Relations’ Malcolm and Carolyn Wiener Annual Lecture on Science and Technology, starting 02/27/2015 at 12:45 EST.

The Malcolm and Carolyn Wiener Annual Lecture on Science and Technology addresses issues at the intersection of science, technology, and foreign policy. In this lecture, experts discuss artificial intelligence and robot technology, and their economic impact on industry and society over the next decade.

Speakers:

  • Rodney Brooks, Panasonic Professor of Robotics (Emeritus), Computer Science and Artificial Intelligence Lab, Massachusetts Institute of Technology; Founder, Chairman, and Chief Technology Officer, Rethink Robotics
  • Abhinav Gupta, Assistant Research Professor, Robotics Institute, Carnegie Mellon University
  • Andrew McAfee, Principal Research Scientist and Cofounder, Initiative on the Digital Economy, Sloan School of Management, Massachusetts Institute of Technology

Presider:

  • Nicholas Thompson, Editor, NewYorker.com

Quadrotor automatically recovers from failure or aggressive launch, without GPS

Photo credit: Robotics & Perception Group, University of Zurich.

When a drone flies close to a building, it can temporarily lose its GPS signal and position information, possibly leading to a crash. To ensure safety, a fall-back system is needed to help the quadrotor regain stable flight as soon as possible. We developed a new technology that allows a quadrotor to automatically recover and stabilize from any initial condition without relying on external infrastructure like GPS. The technology allows the quadrotor to be used safely both indoors and out, and to recover stable flight after a GPS loss or system failure. And because the recovery is so quick, it even works after an aggressive throw, allowing you to launch a quadrotor simply by tossing it in the air like a baseball.

How it works

Photo credit: Robotics & Perception Group, University of Zurich.

Our quadrotor is equipped with a single camera, an inertial measurement unit, and a distance sensor (TeraRanger One). The stabilization system emulates the human visual system and sense of balance. As soon as a toss or a failure situation is detected, our computer-vision software analyses the images for distinctive landmarks in the environment and uses these to restore balance.
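For readers curious what “analysing the images for distinctive landmarks” can look like, here is a minimal, illustrative sketch using standard OpenCV calls. It is not our actual pipeline (see the papers referenced below); the function names and thresholds are invented for illustration.

```python
# Illustrative landmark detection and tracking sketch (invented names and
# thresholds; the real vision pipeline is described in the papers below).
# Distinctive corners are detected once, then tracked from frame to frame.
import cv2
import numpy as np

def find_landmarks(frame_gray, max_corners=100):
    """Detect strong, well-spread corners to re-anchor state estimation."""
    corners = cv2.goodFeaturesToTrack(frame_gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=10)
    return corners if corners is not None else np.empty((0, 1, 2), np.float32)

def track_landmarks(prev_gray, curr_gray, prev_pts):
    """Track landmarks with pyramidal Lucas-Kanade optical flow."""
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      prev_pts, None)
    keep = status.ravel() == 1
    return prev_pts[keep], curr_pts[keep]
```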

All the image processing and control runs on a smartphone processor on board the drone. The onboard sensing and computation renders the drone safe and able to fly unaided. This allows the drone to fulfil its mission without any communication or interaction with the operator.

The recovery procedure consists of multiple stages. First, the quadrotor stabilizes its attitude and altitude, and then it re-initializes its visual state-estimation pipeline before stabilizing fully autonomously. To experimentally demonstrate the performance of our system, in the video we aggressively throw the quadrotor in the air by hand and have it recover and stabilize all by itself. We chose this example as it simulates conditions similar to failure recovery during aggressive flight. Our system was able to recover successfully in several hundred throws in both indoor and outdoor environments.
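In pseudocode form, the staged logic looks roughly like the Python sketch below. The stage names, thresholds and sensor fields are invented for illustration; the actual controller is described in the ICRA 2015 paper referenced below.

```python
# Sketch of the staged recovery logic as a simple state machine
# (hypothetical names and thresholds, for illustration only).
from enum import Enum

class Stage(Enum):
    ATTITUDE = 1      # level the body using the IMU only
    ALTITUDE = 2      # hold height using the distance sensor
    VISION_INIT = 3   # re-initialize the visual state estimator
    STABILIZED = 4    # full vision-based position hold

def recovery_step(stage, imu, distance, vision):
    """Advance the recovery state machine by one control tick."""
    if stage == Stage.ATTITUDE:
        if abs(imu.roll) < 0.05 and abs(imu.pitch) < 0.05:  # ~3 degrees
            return Stage.ALTITUDE
    elif stage == Stage.ALTITUDE:
        if abs(distance.height_error) < 0.1:                # metres
            return Stage.VISION_INIT
    elif stage == Stage.VISION_INIT:
        if vision.initialized:                              # enough landmarks tracked
            return Stage.STABILIZED
    return stage
```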

More info: Robotics and Perception Group, University of Zurich.

References

M. Faessler, F. Fontana, C. Forster, D. Scaramuzza. Automatic Re-Initialization and Failure Recovery for Aggressive Flight with a Monocular Vision-Based Quadrotor. IEEE International Conference on Robotics and Automation (ICRA), Seattle, 2015.

M. Faessler, F. Fontana, C. Forster, E. Mueggler, M. Pizzoli, D. Scaramuzza. Autonomous, Vision-based Flight and Live Dense 3D Mapping with a Quadrotor Micro Aerial Vehicle. Journal of Field Robotics, 2015.


If you liked this article, you may also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

 

Deep Learning Primer

3D brain scan analysis by applying deep learning algorithms (Photo: Scyfer)

The technology that unlocks intelligence from big data – deep learning – is explained in this video by Max Welling, a professor at the University of Amsterdam, and a founder of the Dutch deep learning startup Scyfer.

Even though I have a software and systems background in demographic data, this short video was a much-needed and easily understood primer on this new science of deep learning.

Peter Asaro: Challenges and approaches to developing policy for robots


As part of the Center for Information Technology Policy (CITP) Luncheon speaker series, Peter Asaro gives a talk on developing policy for robots.

Robotics stands on the cusp of an explosion of applications and widespread adoption. Already the development and popular use of small UAV drones is gaining momentum, self-driving cars could be market-ready in a few short years, and the next generation of fully autonomous military drones is in development. Yet the regulatory policies necessary to ensure the social and economic benefits of these technologies are not yet in place. The FAA has struggled to devise operational regulations for small UAV drones, and has not yet addressed the privacy concerns they raise. Google has influenced state legislatures to pass laws permitting self-driving cars, yet the liability issues and insurance regulations remain open questions, as do the safety requirements for these cars to interact with human drivers. And while the United Nations has begun discussions over the possible need to regulate fully autonomous weapons, the development of such systems continues at a rapid pace. I will present my work on some of these issues, and ask whether a more comprehensive regulatory framework might be able to address the questions of ensuring public safety and privacy in the coming revolution in robotics.


When is an ice cube not an ice cube?


Japanese advertising agency wins award for Suntory Whisky ad campaign using CNC-milled ice cubes and a 3D printing app from Autodesk.

In 2014, Tokyo-based advertising agency TBWAHAKUHODO created an ad for Suntory Whisky which included the most gorgeous ice cubes you'll ever see. The ad won 6 awards at this year's Asia Pacific Ad Festival (AdFest) in Thailand. 

The ice cubes were carved from blocks of ice using a CNC router – essentially an inverse 3D-printing technique. The router had to be chilled to -7 degrees Celsius to keep the ice from melting. The process is better described as 3D milling than 3D printing: rather than building up an object additively, the mill shaves ice away to create the desired shape. Like a 3D printer, the mill is driven by a computer, in this case running Autodesk’s 3D-printing app, Autodesk 123D.
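To make the additive/subtractive contrast concrete, here is a toy Python generator for roughing passes on a CNC mill. This is purely illustrative – not the agency’s toolchain and not Autodesk 123D output – and all dimensions, depths and feed rates are invented.

```python
# Toy generator for subtractive roughing passes: instead of depositing
# material layer by layer, the tool steps DOWN in Z and clears a square
# contour at each depth -- the inverse of a 3D printer's layers.
def pocket_gcode(width=40.0, depth=10.0, step_down=2.0, feed=300):
    lines = ["G21 ; millimetres", "G90 ; absolute coordinates"]
    z = 0.0
    while z > -depth:
        z = max(z - step_down, -depth)
        lines.append(f"G1 Z{z:.2f} F{feed} ; step down")
        # clear one square contour at this depth
        for x, y in [(0, 0), (width, 0), (width, width), (0, width), (0, 0)]:
            lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")
    lines.append("G0 Z5.0 ; retract")
    return "\n".join(lines)

print(pocket_gcode())
```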

Is nothing sacred?


The Year of CoCoRo Video #05/52: Lily swarm size awareness


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. This video shows how the swarm can estimate its own size.

Lily robots build swarms that change in size over time. By using a bio-inspired method of signal exchange, these swarms can make reliable estimates of their own size. Our Lily robots emit a pulsed signal that is relayed by the other Lily robots in the swarm, just as slime mold amoebas or fireflies relay their signals in nature. Based on this simple signal exchange, every member can estimate the number of other swarm members around it.
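The idea behind the estimate can be sketched in a few lines: if every robot initiates a pulse wave at a known rate, and every wave is relayed to the whole swarm, then the rate at which any one robot hears waves scales with the swarm size. A toy simulation with invented parameters (an illustration only, not the actual Lily firmware):

```python
# Toy model of pulse-based swarm size estimation (illustration, not the
# Lily firmware). Each robot initiates a wave with probability p per time
# step; relaying delivers every wave to every member, so one observer hears
# waves at a rate of roughly N * p and can estimate N as rate / p.
import random

def estimate_swarm_size(n_robots, p=0.02, steps=5000, seed=1):
    rng = random.Random(seed)
    waves_heard = 0
    for _ in range(steps):
        # waves started this step, all of which reach our observer
        waves_heard += sum(rng.random() < p for _ in range(n_robots))
    return (waves_heard / steps) / p

for n in (5, 10, 20, 41):
    print(n, "robots -> estimate", round(estimate_swarm_size(n), 1))
```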

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

The Year of CoCoRo Video #06/52: Jeff swarm size measurement


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. This video shows the Jeff robot using an algorithm to estimate the size of the swarm.

It is important for our robot swarm that the swarm as a whole is aware of its size. We use a bio-inspired method, called the “fireslime” algorithm, to achieve this form of collective awareness. The algorithm makes the robots spread a one-bit signal (pulse) among the swarm members, allowing them to make reliable and precise estimates of their swarm’s size. This video shows an advanced version of the algorithm implemented on Jeff robots.
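The relay rule at the heart of such a pulse-spreading scheme can be pictured as a tiny per-robot state machine: re-emit a heard pulse once, then go refractory so that each one-bit wave sweeps the swarm without echoing back and forth. The sketch below is our hedged reading of that description, not the actual CoCoRo code.

```python
# Sketch of a one-bit relay rule with a refractory period (a reading of the
# 'fireslime' description, not the CoCoRo implementation). Re-emitting a
# heard pulse exactly once, then ignoring pulses for a while, lets each
# wave cross the swarm once instead of reverberating.
REFRACTORY = 10  # time steps; an invented value

class PulseRelay:
    def __init__(self):
        self.cooldown = 0

    def step(self, pulse_heard):
        """Return True if the robot re-emits a pulse this time step."""
        if self.cooldown > 0:
            self.cooldown -= 1
            return False
        if pulse_heard:
            self.cooldown = REFRACTORY
            return True
        return False
```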

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

The Year of CoCoRo Video #07/52: Lily flocking by slimemold


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. This video shows how signal waves can be used to keep a swarm of Lily robots together as a group.

A group of Lily robots can achieve a coherent shoaling or flocking configuration by emitting and receiving pulsed light signals. Similar to slime mold or fireflies, such pulsed signals are relayed from one agent to the next, forming signal waves that move through the whole swarm. We use such waves to keep the swarm of Lily robots together as a group, to coordinate the swarm and to move it in a desired direction.

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

The Year of CoCoRo Video #08/52: Lily emergent taxis


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. This video shows how a swarm of Lily robots can form a coherent group by exchanging light pulses among the group members, similar to how slime-mold does in biology.

By modulating the frequency of these signals, the group can alter the path of the emerging blinking wave and turn the whole group towards an aggregation target. Such a target can be any gradient-emitting source, regardless of the type of signal it emits. We demonstrate this here using a light source as the target.
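One way to picture the mechanism: robots that sense more light blink more often, so waves tend to originate on the bright side of the group, and a robot that steps toward the origin of each wave it hears drifts toward the target. A toy one-dimensional sketch with invented parameters (an illustration, not the CoCoRo controller):

```python
# Toy 1-D model of emergent taxis (illustration, not the CoCoRo controller):
# blink probability grows with sensed light, waves originate preferentially
# on the bright side of the group, and every robot steps toward the wave
# origin. Random jitter keeps the group spread out, so the swarm's mean
# position drifts toward the light source.
import random

def light(x):
    return max(0.0, 1.0 - abs(x - 10.0) / 10.0)  # source at x = 10

def taxis(positions, steps=2000, seed=2):
    rng = random.Random(seed)
    for _ in range(steps):
        # frequency modulation: brighter robots initiate waves more often
        initiators = [i for i, x in enumerate(positions)
                      if rng.random() < 0.5 * light(x)]
        if not initiators:
            continue
        origin = positions[rng.choice(initiators)]
        positions = [x + 0.1 * ((origin > x) - (origin < x))
                     + rng.uniform(-0.1, 0.1) for x in positions]
    return positions

start = [0.0, 1.0, 2.0, 3.0]
end = taxis(start)
print(sum(start) / len(start), "->", sum(end) / len(end))
```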

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.
