
The Robot Academy: Lessons in inverse kinematics and robot motion


The Robot Academy is a new learning resource from Professor Peter Corke and the Queensland University of Technology (QUT), the team behind the award-winning Introduction to Robotics and Robotic Vision courses. There are over 200 lessons available, all for free.

The lessons were created in 2015 for the Introduction to Robotics and Robotic Vision courses. We describe our approach to creating the original courses in the article An Innovative Educational Change: Massive Open Online Courses in Robotics and Robotic Vision. The courses were designed for university undergraduates, but many lessons are suitable for anybody; each lesson carries a difficulty rating so you can easily judge which ones fit. Below are lessons on inverse kinematics and robot motion.

You can watch the entire masterclass on the Robot Academy website.

Introduction

In this video lecture, we will learn about inverse kinematics, that is, how to compute the robot’s joint angles given the desired pose of its end-effector and knowledge of the dimensions of its links. We will also learn how to generate paths that lead to smooth, coordinated motion of the end-effector.
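One common way to generate such smooth motion is to pick a scalar profile that rises from 0 to 1 with zero velocity and acceleration at both ends, then interpolate each joint angle along it. Below is a minimal standalone sketch using a quintic polynomial; it illustrates the general technique and is not necessarily the exact profile used in the lessons.

```python
import numpy as np

def quintic_traj(q0, qf, t):
    """Quintic time-scaling from q0 to qf over time vector t.

    The polynomial 6*tau**5 - 15*tau**4 + 10*tau**3 rises from 0 to 1
    with zero velocity and zero acceleration at both endpoints, so the
    joint starts and stops smoothly.
    """
    tau = t / t[-1]                       # normalise time to [0, 1]
    s = 6*tau**5 - 15*tau**4 + 10*tau**3
    return q0 + (qf - q0) * s

t = np.linspace(0.0, 2.0, 50)
q = quintic_traj(0.0, np.pi / 2, t)       # joint angle: 0 to 90 deg in 2 s
```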


Inverse kinematics for a 2-joint robot arm using geometry

In this lesson, we revisit the simple 2-link planar robot and determine the inverse kinematic function using simple geometry and trigonometry.
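The geometric solution has a well-known closed form: the law of cosines gives the elbow angle, and the base angle follows from the target bearing minus a wrist-offset term. Here is a minimal NumPy sketch of that closed form, with link lengths a1 and a2 as parameters; this is a generic textbook derivation, not code from the lesson.

```python
import numpy as np

def ik_2link(x, y, a1, a2, elbow_up=True):
    """Closed-form IK for a planar 2-link arm with link lengths a1, a2.

    Returns joint angles (q1, q2) placing the end-effector at (x, y).
    Raises ValueError if the target is out of reach.
    """
    # Law of cosines gives the elbow angle
    c2 = (x**2 + y**2 - a1**2 - a2**2) / (2 * a1 * a2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    q2 = np.arccos(c2)
    if elbow_up:
        q2 = -q2  # pick the other of the two mirror solutions (naming varies)
    # Base angle: bearing to the target minus the wrist-offset term
    q1 = np.arctan2(y, x) - np.arctan2(a2 * np.sin(q2), a1 + a2 * np.cos(q2))
    return q1, q2

# Verify by substituting back into the forward kinematics:
q1, q2 = ik_2link(0.9, 0.4, a1=1.0, a2=0.6)
print(1.0*np.cos(q1) + 0.6*np.cos(q1 + q2),   # ~0.9
      1.0*np.sin(q1) + 0.6*np.sin(q1 + q2))   # ~0.4
```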


Inverse kinematics for a 2-joint robot arm using algebra

In this lesson, we solve the inverse kinematics of the same 2-link planar robot, this time using an algebraic approach.
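The key algebraic step is to square and add the two forward-kinematics equations, which eliminates q1 and leaves an expression in q2 alone. A small sympy sketch of that step, as a generic illustration rather than the lesson’s own working:

```python
import sympy as sp

q1, q2, a1, a2 = sp.symbols('q1 q2 a1 a2', real=True)

# Forward kinematics of the planar 2-link arm
x = a1*sp.cos(q1) + a2*sp.cos(q1 + q2)
y = a1*sp.sin(q1) + a2*sp.sin(q1 + q2)

# Squaring and adding eliminates q1, leaving a function of q2 alone
r2 = sp.trigsimp(sp.expand(x**2 + y**2))
print(r2)   # a1**2 + 2*a1*a2*cos(q2) + a2**2

# So cos(q2) = (x**2 + y**2 - a1**2 - a2**2) / (2*a1*a2), and acos()
# yields the two mirror (elbow-up / elbow-down) solutions.
```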




SMART trials self-driving wheelchair at hospital


Image: MIT CSAIL

Singapore and MIT have been at the forefront of autonomous vehicle development. First, there were self-driving golf buggies. Then, an autonomous electric car. Now, leveraging similar technology, MIT and Singaporean researchers have developed and deployed a self-driving wheelchair at a hospital.

Spearheaded by Daniela Rus, the Andrew (1956) and Erna Viterbi Professor of Electrical Engineering and Computer Science and director of MIT’s Computer Science and Artificial Intelligence Laboratory, this autonomous wheelchair is an extension of the self-driving scooter that launched at MIT last year — and it is a testament to the success of the Singapore-MIT Alliance for Research and Technology, or SMART, a collaboration between researchers at MIT and in Singapore.

Rus, who is also the principal investigator of the SMART Future Urban Mobility research group, says this newest innovation can help nurses focus more on patient care by relieving them of logistical tasks such as searching for wheelchairs and wheeling patients through the complex hospital network.

“When we visited several retirement communities, we realized that the quality of life is dependent on mobility. We want to make it really easy for people to move around,” Rus says.

ep.354: Autonomous Flight Demo with CMU AirLab #ICRA2022, with Sebastian Scherer


Sebastian Scherer from CMU’s AirLab gives us a behind-the-scenes demo of their autonomous flight control AI at ICRA. Their approach aims to cooperate with human pilots and act the way they would.

The team took this approach to create a more natural, less intrusive process for human and AI pilots sharing a single airport. They describe it as a Turing test: ideally, the human pilot will be unable to distinguish an AI from a person operating the plane.

Their communication system works in parallel with a six-camera hardware package based on the Nvidia AGX dev kit, which measures the angular speed of objects moving across the video frames.

In this framework, high angular velocity means low risk: the object is sweeping quickly across the camera’s field of view, so it is crossing the aircraft’s path rather than converging on it.

Low angular velocity indicates high risk, since an object holding a nearly constant bearing could be flying directly at the plane, headed for a collision.
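A toy sketch of this constant-bearing cue, using a hypothetical single-camera pixel track; the focal length, frame rate, and risk threshold below are illustrative assumptions, not values from the AirLab system.

```python
import numpy as np

def bearing(cx, fx, cx0):
    # Horizontal bearing (rad) of a detection at pixel column cx,
    # for a pinhole camera with focal length fx and principal point cx0.
    return np.arctan2(cx - cx0, fx)

def collision_risk(track_px, fx=1400.0, cx0=960.0, fps=30.0, thresh=0.05):
    """Classify risk from a track of pixel x-positions across frames.

    A nearly constant bearing (low angular velocity) is the classic
    collision-course cue; a fast-sweeping bearing means the object
    will pass clear. All parameter values here are assumptions.
    """
    b = bearing(np.asarray(track_px, float), fx, cx0)
    omega = np.abs(np.diff(b)) * fps   # angular speed between frames, rad/s
    return "high risk" if omega.mean() < thresh else "low risk"

print(collision_risk([900, 902, 903, 905]))   # near-constant bearing -> high risk
print(collision_risk([600, 700, 800, 900]))   # sweeping bearing -> low risk
```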


Underwater Human-Robot Interaction #ICRA2022


How do people communicate when they are underwater? With body language, of course.

Marine environments present a unique set of challenges that render many technologies developed for land applications useless. Communication by sound, at least the way people use it above water, is one of them.

Michael Fulton tackles this challenge in his ICRA 2022 presentation, using body language to communicate with an autonomous underwater vehicle (AUV). Tune in for more.

His poster can be viewed here.
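For a sense of how gesture commands might be wired into a vehicle, here is a deliberately simple sketch that maps recognized gesture labels to AUV commands and ignores low-confidence detections. The gesture vocabulary, confidence threshold, and command names are all hypothetical, not taken from Fulton’s system.

```python
from enum import Enum, auto

class Cmd(Enum):
    FOLLOW_ME = auto()
    STOP = auto()
    SURFACE = auto()

# Hypothetical gesture-to-command table; the actual vocabulary in
# Fulton's work is not specified here, so these mappings are assumptions.
GESTURES = {
    "point_forward": Cmd.FOLLOW_ME,
    "flat_palm": Cmd.STOP,
    "thumb_up": Cmd.SURFACE,
}

def interpret(gesture_label, confidence, min_conf=0.8):
    # Only act on confident classifications; underwater imagery is noisy.
    if confidence < min_conf:
        return None
    return GESTURES.get(gesture_label)

print(interpret("flat_palm", 0.93))   # Cmd.STOP
print(interpret("thumb_up", 0.40))    # None (too uncertain to act on)
```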

Michael Fulton

Michael Fulton is a Ph.D. candidate at the University of Minnesota Twin Cities. His research centers on underwater robotics, particularly applications where robots work alongside humans: human-robot interaction and robot perception using computer vision and deep learning, with the intent of creating systems that can collaborate with humans in challenging environments.
