Channel: video – Robohub

Pleurobot: Multimodal locomotion in a bioinspired robot


The Pleurobot (Photo: Hillary Sanctuary & BioRob).

The Pleurobot is a bioinspired robot being developed by the BioRob laboratory at EPFL and NCCR Robotics. Taking its cues from the salamander, the Pleurobot is a walking robot that can change its gait to navigate uneven terrain, and it is currently learning to swim. Watch the video to see the researchers discuss what they are doing with the Pleurobot and how they hope to improve it in the future.

 

 

For further information on the Pleurobot please see:

The project’s homepage

Introducing Pleurobot (video)

 


The Year of CoCoRo Video #07/52: Lily flocking by slime mold


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. This video shows how signal waves can be used to keep a swarm of Lily robots together as a group.

A group of Lily robots can achieve a coherent shoaling or flocking configuration by emitting and receiving pulsed light signals. Similar to slime mold or fireflies, such pulsed signals are relayed from one agent to the next, forming signal waves that move through the whole swarm. We use such waves to keep the swarm of Lily robots together as a group, to coordinate the swarm and to move it in a desired direction.
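As a rough illustration of how such relayed pulses form a travelling wave, here is a minimal sketch (our own toy model, not the CoCoRo code): each robot re-emits a pulse it sees from a neighbour, then ignores further pulses for a short refractory period, so a single emitter produces a wave that sweeps once through the group.

```python
# Toy model of slime-mold-style pulse relaying (illustrative assumption,
# not the actual CoCoRo algorithm). Robots sit in a line; each relays a
# blue-light pulse to its neighbours, then goes refractory for a while.

REFRACTORY = 3  # time steps a robot ignores pulses after flashing (assumed)

def propagate(n_robots, source, n_steps):
    """Return, per time step, the sorted list of robots flashing a pulse."""
    refractory = [0] * n_robots
    flashing = {source}
    history = []
    for _ in range(n_steps):
        history.append(sorted(flashing))
        nxt = set()
        for r in flashing:
            refractory[r] = REFRACTORY          # just flashed: go refractory
            for nb in (r - 1, r + 1):           # line-of-sight neighbours
                if 0 <= nb < n_robots and refractory[nb] == 0:
                    nxt.add(nb)                 # neighbour relays next step
        refractory = [max(0, t - 1) for t in refractory]
        flashing = nxt
    return history
```

The refractory period is what keeps the wave moving outward instead of bouncing back and forth between neighbours indefinitely.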

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

Ground-flight collaboration


Ground-Air collaboration between robots (Photo: RPG and Alain Herzog).

Working in the field of rescue robotics, the Robotics and Perception Group (UZH and NCCR Robotics) studies how to get aerial robots and ground robots communicating, with the aim of exploiting the strengths of each in a team. In the video below, student Elias Müggler explains how he is doing this.

 

 

For further information:

Robotics Perception Group website

Aerial-guided Navigation of a Ground Robot among Movable Obstacles (paper by Elias Müggler)

Autonomous flying robots – Davide Scaramuzza at TedxZurich:

Demonstration of the project, winning the Kuka innovation award 2014:

Sensory-motor tissues for soft robots


In this video, Jun Shintake, a PhD student at LIS (EPFL) and NCCR Robotics, explains his project “Sensory-Motor Tissues for Soft Robots”.

 

For more information on this project please see:

The project’s website

New soft antagonistic actuator enables robots to fold

A foldable antagonistic actuator (academic paper)

Variable stiffness material based on rigid low-melting-point-alloy microstructures embedded in soft poly(dimethylsiloxane) (PDMS) (academic paper)

 

The Year of CoCoRo Video #08/52: Lily emergent taxis


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. This video shows how a swarm of Lily robots can form a coherent group by exchanging light pulses among the group members, similar to how slime-mold does in biology.

By modulating the frequency of these signals the group can alter the path of the emerging blinking wave to turn the whole group towards the aggregation target. Such a target can be any form of gradient emitting source, regardless of the type of the emitted signal. We demonstrate this here by using a light source as a target.

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

Livestream: Rodney Brooks, Abhinav Gupta, Andrew McAfee on AI and the rise of robots


Livestream of the Council on Foreign Relations’ Malcolm and Carolyn Wiener Annual Lecture on Science and Technology, starting 02/27/2015 at 12:45 EST.

The Malcolm and Carolyn Wiener Annual Lecture on Science and Technology addresses issues at the intersection of science, technology, and foreign policy. In this lecture, experts discuss artificial intelligence and robot technology, and their economic impact on industry and society over the next decade.

Speakers:

  • Rodney Brooks, Panasonic Professor of Robotics (Emeritus), Computer Science and Artificial Intelligence Lab, Massachusetts Institute of Technology; Founder, Chairman, and Chief Technology Officer, Rethink Robotics
  • Abhinav Gupta, Assistant Research Professor, Robotics Institute, Carnegie Mellon University
  • Andrew McAfee, Principal Research Scientist and Cofounder, Initiative on the Digital Economy, Sloan School of Management, Massachusetts Institute of Technology

Presider:

  • Nicholas Thompson, Editor, NewYorker.com

The Year of CoCoRo Video #09/52: Jeff in the water current


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. In this video we generate water currents and turbulence using a water hose in our outdoor pool.

The video shows our first tests of the Jeff robot under such conditions (performed in summer 2013 and spring 2014). Those tests clearly indicated that the maneuverability of the robot (steering, forward drive) is strong enough to compensate for currents and drifts of up to approximately 1 m/sec. At the time we were very happy to achieve such impressive capabilities in such a small robot, as this maneuverability is a prerequisite for good performance of an underwater swarm in out-of-the-lab conditions (large outdoor tanks, lazy river arms, ponds, lakes).

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

The Year of CoCoRo Video #10/52: Feeding Jeff with magnets


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. In this video we test Jeff’s ability to pick up a target (the magnet) while swimming in a current.

In this video we again generate water currents and turbulence using a water hose in our outdoor pool. By radio-frequency control we navigate a Jeff robot remotely through these currents. To test the precision of the steering under these conditions, we hold some small magnets in our hands as targets for the Jeff robot to pick up. Thus it looks as if we are feeding the robot with magnets :-)

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.


The Year of CoCoRo Video #11/52: Out of the lab in Livorno


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. This video summarizes all out-of-the-lab activities in which we tested our robots (Lily and Jeff).

More detailed videos will follow throughout this year, showing some of those activities more specifically. The activities shown here feature autonomous robots in larger outdoor pools, ponds, lakes, rivers and ocean harbours. They act either alone (e.g. as autonomous underwater camera agents) or in smaller groups (swarms).

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

The Year of CoCoRo Video #12/52: Fun with Jeff in the pool


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. One day in early 2014, at one of those long workshops, we stayed at a hotel with a large, deep outdoor pool. It was a nice day in Italy, so what else could we do than take a Jeff robot in autonomous driving mode to the pool and have some fun with it?

It didn’t take long before more and more hotel guests gathered to watch. Special applause to Vega and Finn, the kids of the project coordinator, for helping with the filming (above and under the water), catching the robot, and sometimes rescuing it from the bottom after we pushed it beyond its limits.

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

The Year of CoCoRo Video #13/52: Lilycam in nature


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. In this video, the Lily robot gives us an underwater view of a fishing pond.

Lily is an autonomously diving robot. After we attached an underwater camera to it, it became an autonomous diving camera agent. We used this “Lily-Cam” to look into the little fishing ponds at our Zoological Department. The view below the surface offered a fascinating glimpse of the underwater world, including algae forests, fish and other aquatic organisms.

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

Quadrotor automatically recovers from failure or aggressive launch, without GPS


Photo credit: Robotics & Perception Group, University of Zurich.

When a drone flies close to a building, it can temporarily lose its GPS signal and position information, possibly leading to a crash. To ensure safety, a fall-back system is needed to help the quadrotor regain stable flight as soon as possible. We developed a new technology that allows a quadrotor to automatically recover and stabilize from any initial condition without relying on external infrastructure like GPS. The technology allows the quadrotor to be used safely both indoors and out, and to recover stable flight after a GPS loss or system failure. And because the recovery is so quick, it even works after an aggressive throw, allowing you to launch a quadrotor simply by tossing it in the air like a baseball.

How it works

Photo credit: Robotics & Perception Group, University of Zurich.

Our quadrotor is equipped with a single camera, an inertial measurement unit, and a distance sensor (Teraranger One). The stabilization system of the quadrotor emulates the visual system and the sense of balance within humans. As soon as a toss or a failure situation is detected, our computer-vision software analyses the images for distinctive landmarks in the environment, and uses these to restore balance.

All the image processing and control runs on a smartphone processor on board the drone. The onboard sensing and computation renders the drone safe and able to fly unaided. This allows the drone to fulfil its mission without any communication or interaction with the operator.

The recovery procedure consists of multiple stages. First, the quadrotor stabilizes its attitude and altitude, and then it re-initializes its visual state-estimation pipeline before stabilizing fully autonomously. To experimentally demonstrate the performance of our system, in the video we aggressively throw the quadrotor in the air by hand and have it recover and stabilize all by itself. We chose this example as it simulates conditions similar to failure recovery during aggressive flight. Our system was able to recover successfully in several hundred throws in both indoor and outdoor environments.
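The staged recovery described above can be pictured as a small state machine. The sketch below follows the stages named in the text; the transition conditions and sensor checks are our own illustrative assumptions, not the group's actual implementation.

```python
# Illustrative state machine for the staged recovery: level the attitude,
# hold altitude, re-initialize the visual estimator, then fly autonomously.
# Stage names follow the article; everything else is assumed.

from enum import Enum, auto

class Stage(Enum):
    ATTITUDE = auto()      # level the body using the IMU only
    ALTITUDE = auto()      # hold height with the distance sensor
    VISION_INIT = auto()   # re-initialize the visual state-estimation pipeline
    STABILIZED = auto()    # fully autonomous, vision-based flight

def step(stage, level, height_ok, vision_ready):
    """Advance one control tick; the booleans stand in for sensor checks."""
    if stage is Stage.ATTITUDE and level:
        return Stage.ALTITUDE
    if stage is Stage.ALTITUDE and height_ok:
        return Stage.VISION_INIT
    if stage is Stage.VISION_INIT and vision_ready:
        return Stage.STABILIZED
    return stage           # condition not met yet: stay in the current stage
```

Ordering the stages this way mirrors the text: attitude and altitude are recovered from inertial and distance sensing alone, so the camera pipeline only has to initialize once the platform is already roughly steady.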

More info: Robotics and Perception Group, University of Zurich.


References

M. Faessler, F. Fontana, C. Forster, D. Scaramuzza. Automatic Re-Initialization and Failure Recovery for Aggressive Flight with a Monocular Vision-Based Quadrotor. IEEE International Conference on Robotics and Automation (ICRA), Seattle, 2015.

M. Faessler, F. Fontana, C. Forster, E. Mueggler, M. Pizzoli, D. Scaramuzza. Autonomous, Vision-based Flight and Live Dense 3D Mapping with a Quadrotor Micro Aerial Vehicle. Journal of Field Robotics, 2015.




 

Deep Learning Primer


3D brain scans analysis by applying Deep Learning algorithms – photo : Scyfer

The technology that unlocks intelligence from big data – deep learning – is explained in this video by Max Welling, a professor at the University of Amsterdam, and a founder of the Dutch deep learning startup Scyfer.

Even though my software and systems background is in demographic data, I found this short video a necessary and easily understood primer on this new science of deep learning.

Peter Asaro: Challenges and approaches to developing policy for robots


As part of the Center for Information Technology Policy (CITP) Luncheon Speaker Series, Peter Asaro gives a talk on developing policy for robots.

Robotics stands on the cusp of an explosion of applications and widespread adoption. Already the development and popular use of small UAV drones is gaining momentum, self-driving cars could be market-ready in a few short years, and the next generation of fully autonomous military drones is in development. Yet the regulatory policies necessary to ensure the social and economic benefits of these technologies are not yet in place. The FAA has struggled to devise operational regulations for small UAV drones, and has not yet addressed the privacy concerns they raise. Google has influenced state legislatures to pass laws permitting self-driving cars, yet the liability issues and insurance regulations are open questions, as are the safety requirements for these cars to interact with human drivers. And while the United Nations has begun discussions over the possible need to regulate fully autonomous weapons, the development of such systems continues at a rapid pace. I will present my work on some of these issues, as well as ask whether a more comprehensive regulatory framework might be able to address the questions of ensuring public safety and privacy in the coming revolution in robotics.

The Year of CoCoRo Video #14/52: LilyCam in a Lake


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. In this video, the Lily robot takes us below a lake in Austria.

After our successful use of “Lily-Cam” in our small ponds, we went further. At several lovely places in Styria (Austria) we took a look below the water surface and encountered beautiful, picturesque landscapes, fish and even diving ducks.

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

When is an ice cube not an ice cube?


Japanese advertising agency wins award for Suntory Whisky ad campaign using CNC-milled ice cubes and a 3D printing app from Autodesk.

In 2014, Tokyo-based advertising agency TBWAHAKUHODO created an ad for Suntory Whisky featuring the most gorgeous ice cubes you'll ever see. The ad won six awards at this year's Asia Pacific Ad Festival (AdFest) in Thailand.

The ice cubes were carved from a block of ice using a CNC router, which had to be chilled to -7 degrees Celsius to prevent the ice from melting. The process is essentially 3D printing in reverse: rather than building up an object additively, the mill shaves the ice away to create the desired shape. Like a 3D printer, the mill is driven by a computer, in this case using a 3D-printing app from Autodesk, Autodesk 123D.

Is nothing sacred?

Read more

The Year of CoCoRo Video #15/52: TRAILER LilyCam in a wild river


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. In this video, we take Lily-Cam into rough water.

Austria is home not only to beautiful lakes; it also has wild rivers and creeks. After “Lily-Cam” did its job in the lakes, we also threw it into a fast whitewater river. We were lucky to catch it downriver after a few minutes, and the robot survived the adventure. However, our engineers, who also have to constantly maintain and repair the robots, didn’t like the movie. ;-)

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

The Year of CoCoRo Video #16/52: JeffCam in a lake


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. In this video a Jeff-Cam takes us on a trip below an Italian lake.

Jeff is much more agile and powerful than Lily, so mounting a camera on top of an autonomous Jeff robot produced an even better autonomous camera agent. After some preliminary tests and tuning in a pool, we went to an Italian lake to see how it looks down there below the surface.

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

CoCoRo: New video series tracks dev’t of collective behaviour in autonomous underwater swarm


The EU-funded Collective Cognitive Robotics (CoCoRo) project comprises the largest autonomous underwater swarm in the world. Following three-and-a-half years of intensive research, CoCoRo’s interdisciplinary team of scientists from universities in Austria, Italy, Germany, Belgium, and the UK built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Swarm members are not only aware of their own surroundings, but also of their own and other vehicles’ experiences. Throughout 2015 – The Year of CoCoRo – the research team will be uploading a new weekly video detailing a new stage in its development and reporting updates here on Robohub. Check out the first ten videos below!

Overview of the CoCoRo system

The swarm consists of Jeff robots (the highlight of the project), 20 smaller (and slower) Lily robots, and a base station at the surface. The setup is shown in a short overview video.

In this simple form of collective self-awareness, the swarm processes information collectively, such that, as a whole it knows more than any single swarm member. The swarm not only collects information about the environment, but also about its own state. As well as staying together as a swarm, it’s also capable of navigating and diving, searching for sunken targets and communicating and processing information as a group.

Research highlights

The swarm communicates findings via a self-established bridge to the “world above”

Not only can the swarm interact with itself, it can also communicate its findings and inner state to the world above. It does this by establishing, self-maintaining and even repairing a bridge between the swarm, located on the sea bed, and a human-controlled floating station that is located on the water’s surface.

During the project, many algorithms were developed and tested with the CoCoRo prototype swarm. Future applications include environmental monitoring and oceanic search missions.

Three-layer decentralized swarm design

When designing the swarm, the scientists decided to follow the KISS principle: “Keep It Simple, Stupid.” A decentralized approach was chosen for the CoCoRo project because a swarm has no single point of failure. Swarms are robust, flexible and scalable systems that can adapt easily to changing environments. Additionally, the technical and cost requirements for a single robot are lower than in non-swarm systems.

Three different layers were implemented for the cognition: individual, group and global. Single AUVs collect information, local groups share and compare it and, finally, the whole swarm makes collective decisions.
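A toy sketch of those three layers (an illustration of the idea, not CoCoRo's software) might look like the following: individual AUVs form opinions from their own readings, local groups pool their readings, and the swarm takes a global majority decision.

```python
# Illustrative three-layer cognition: individual -> group -> swarm.
# Thresholds and pooling rules are assumptions for the sake of the example.

def individual(reading, threshold=0.5):
    """Layer 1: a single AUV's opinion from its own sensor reading."""
    return reading > threshold

def group(readings, threshold=0.5):
    """Layer 2: a local group shares and pools its members' readings."""
    avg = sum(readings) / len(readings)
    return avg > threshold

def swarm_decision(groups):
    """Layer 3: the whole swarm decides by majority vote over group opinions."""
    votes = [group(g) for g in groups]
    return votes.count(True) > len(votes) / 2
```

The point of the layering is that noisy individual readings get averaged away at the group level before the swarm commits to a collective decision.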

Bio-inspired algorithms build core functions

While designing the software, the scientists decided to focus on bio-inspired algorithms because they are known to be very flexible and robust. The focal animals were slime mold, fireflies, honeybees, cockroaches and fish. The algorithms were programmed in a modular way; self-organizing cognition algorithms were merged with self-organizing motion algorithms inspired by various animals. In this way, totally new algorithms emerged.

Three types of AUVs communicate via a novel multichannel underwater network

A swarm of 41 prototype AUVs was built for the CoCoRo project, consisting of three different types of AUVs that range in size from a human hand to a man’s foot. Depending on their activity, the robots run for two to six hours without requiring a charge. A unique feature of the CoCoRo project is that sensors were implemented in a combination never used before in underwater robotics. These combinations were necessary to allow the construction of an autonomous underwater swarm that can coordinate mainly through simple signalling between nearest neighbors. In this way, a totally new heterogeneous underwater swarm system was created, consisting of three different types of AUVs: “Jeff” robots that search the sea bed; a base station on the water’s surface (connected to humans); and “Lily” robots that bridge and communicate information between the Jeff robots and the base station.

Check out our other videos below!


Introducing Jeff

The Jeff robots are extremely agile and can resist water currents of 1m/sec. They have autonomous buoyancy control, lateral and vertical motion with optimized propellers and a rich set of sensors. For all these functions, novel energy-saving methods were implemented that guarantee autonomy and durability.


Jeff in turbulent waters

Jeff’s body shape and actuation also allows it to operate in turbulent waters, as is tested in the following video with a remote-controlled Jeff. In this video it was purposely driven into the most turbulent areas to see how it would be affected by the water currents.


Feeding Jeff with magnets

Even in turbulent water, Jeff can be controlled with enough precision to reach a specific point in 3D space. This is demonstrated by holding a small magnet in the water for Jeff to pick up.


The Jeff swarm explores its habitat

The researchers have produced 20 Jeff robots, able to swarm out and search complex habitats in parallel, as shown in the following video.


Collective search by Jeff and Lily Robots

An important aspect of the project is that all three types of robots help each other to perform their collective task. To achieve this, the scientists took inspiration from nature and combined several algorithms into a new collective program for the whole swarm. An example is the collective search for a magnetic (metallic) target on the sea bed, shown here in a pool-based scenario. In this setting, a target is marked by a metal plate and some magnets. Several blocking objects, surrogates for debris and rocks, produce a structured habitat. The Jeff robots, on the ground, first have to spot the target by searching and using their magnetic sense. After the first robot finds the target, it attracts others with blue-light signals. The aggregation of Jeff robots then summons a swarm of Lily robots at higher water levels, which serve to pinpoint the site to humans.


Swarm-size awareness exhibited by Lily robots

It’s important for a swarm of robots to know its own size. In CoCoRo, this was achieved by a novel algorithm inspired by slime mold amoebas; in this case, however, the chemical signals that amoebas exchange were replaced in the algorithm by blue-light blinks. These are transmitted by the robots to their neighbors and propagated onward through the swarm in a wave-like manner. This behavior was first tested on the smaller Lily robots.
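One heavily simplified way to picture how relayed blinks can yield a size estimate (our own toy model, not the published algorithm): if each robot initiates blink waves at a known rate and every wave is relayed to every member, then the number of waves any one robot observes per interval reflects the swarm size.

```python
# Toy swarm-size estimation from blink-wave counts (assumed mechanism).
# Each robot initiates a wave with probability init_prob per time step;
# an observer counts every wave and inverts the expected rate.

import random

def waves_seen(swarm_size, init_prob, n_steps, rng):
    """Count waves reaching one observer over n_steps (all waves propagate)."""
    seen = 0
    for _ in range(n_steps):
        for _ in range(swarm_size):
            if rng.random() < init_prob:
                seen += 1                    # a wave was started and relayed
    return seen

def estimate_size(seen, init_prob, n_steps):
    """Invert the expected wave rate: E[seen] = size * init_prob * n_steps."""
    return seen / (init_prob * n_steps)
```

With enough observation steps, the estimate concentrates around the true swarm size; the real algorithm additionally has to cope with waves colliding and robots miscounting, which this sketch ignores.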


Swarm-size awareness exhibited by Jeff robots

After the algorithm was successfully tested on Lily robots, it was further polished to improve its reaction speed and prediction quality before it was implemented in Jeff robots, as shown by the following video.


Flocking by slime mold

The slime mold inspired algorithm doesn’t just allow the robots to know the size of the swarm. Combined with directed motion, it can also be used to generate slime mold-like flocks of underwater robots.


Emergent taxis of robots

After adding an additional environmental light sensor, this bio-mimicking collective behavior turns into something the scientists call “emergent taxis.” The entire swarm of robots runs uphill in a light gradient even though individual members can’t read the gradient, having only a very rough sensor impression of their local area; the swarm becomes a kind of moving compound eye, with all the robots observing their local environment and influencing each other. Finally, an algorithm designed for counting swarm members transforms it into an acting organ, something that might even be called a “super-organism”.
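A classic way to climb a gradient without any individual sensing its direction is a run-and-tumble strategy, as bacteria use; the sketch below is our own illustrative stand-in for the CoCoRo algorithm, not the actual code. Each simulated robot keeps its heading while the local brightness is increasing and reverses direction more often when it is decreasing, and the swarm mean drifts toward the light.

```python
# Run-and-tumble taxis sketch (assumed stand-in for the CoCoRo behavior).
# Brightness grows with x; robots only compare current vs previous light.

import random

def run_and_tumble(n_steps, rng):
    """One robot on a 1D light gradient, sensing only local brightness."""
    x, direction, prev_light = 0.0, 1, 0.0
    for _ in range(n_steps):
        x += direction
        light = x / 100.0                          # brighter toward large x
        tumble_p = 0.5 if light < prev_light else 0.1
        if rng.random() < tumble_p:
            direction = -direction                 # tumble: reverse heading
        prev_light = light
    return x

def swarm_mean(n_robots, n_steps, rng):
    """Mean final position of the swarm; drifts toward the light source."""
    return sum(run_and_tumble(n_steps, rng) for _ in range(n_robots)) / n_robots
```

No robot here knows which way the light lies; the uphill drift emerges purely from tumbling less often when things are getting brighter.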


Outlook

The CoCoRo project developed and tested a whole set of “enabling technologies”, mostly in pools but sometimes in out-of-lab conditions. When the project ended in late 2014, it was clear to the research team that these technologies should be further developed and brought out of the lab. As a next step in their long-term roadmap, the research team (with additional partners) started a new project called “subCULTron” to develop the next generation of swarm robots and apply them in areas of high impact: fish and mussel farms and the Venice lagoon.

New footage of CoCoRo available on YouTube each week

During the runtime of CoCoRo, the swarm developed an enormous variety of functionalities, which were presented in several demonstrators at the project’s final review. A lot of footage was recorded and many videos were produced. These films not only give a deep insight into the different features of the CoCoRo swarm as it evolved, but also provide a behind-the-scenes glimpse into the scientific work and development that went into the project. The CoCoRo team will be uploading a new video on YouTube each week until the end of the year to celebrate the “The Year of CoCoRo” – stay tuned on Robohub for regular updates.

Links

CoCoRo Homepage
CoCoRo on Youtube
CoCoRo on Facebook




The Year of CoCoRo Video #17/52: Lily confinement by bluelight


The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we will be uploading a new weekly video detailing the latest stage in its development. In this video we use blue-light blinks to keep the swarm together and to keep it in vicinity of the moving base station.

Due to this “confinement”, the radio-controlled base station can pull a whole swarm of Lily robots behind itself like a tail. It is important to confine the robots to specific areas in larger bodies of water because the swarm normally requires a minimum connectivity among agents to work efficiently, which is achieved only above a critical minimum swarm density. Without keeping the robots in a controlled area, robots could get lost and the density could fall below this critical value. Thus, confinement was identified as a critical functionality.
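The critical-density argument can be made concrete with a simple connectivity check (illustrative parameters, not CoCoRo's): given robot positions and a fixed communication range, a flood fill over the “can see each other” graph tells you whether the swarm is still one connected group.

```python
# Connectivity check behind the confinement argument (assumed 1D positions
# and an illustrative communication range, not the project's parameters).

def is_connected(positions, comm_range):
    """Flood-fill the graph where robots within comm_range can signal."""
    if not positions:
        return True
    seen, frontier = {0}, [0]
    while frontier:
        i = frontier.pop()
        for j in range(len(positions)):
            if j not in seen and abs(positions[i] - positions[j]) <= comm_range:
                seen.add(j)                  # j can relay signals from i
                frontier.append(j)
    return len(seen) == len(positions)       # did the wave reach everyone?
```

Spread the same number of robots over too large an area and the check fails, which is exactly the situation the blue-light confinement is there to prevent.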

To learn more about the project, see this introductory post, or check out all the videos from the Year of CoCoRo on Robohub.

 
