February 21, 2024

The main purpose of this workshop is to give our students insight into the research topics currently pursued in the Robotics Lab, helping them connect with the lab's work. In addition, we have invited researchers from closely related fields to present their ongoing research and achievements.

Organizer: Dr. Miklós Koller, e-mail: koller.miklos@itk.ppke.hu

(The lectures will be 30 minutes long, with an additional 5 minutes for questions.)

  • 13:00-13:35 Prof. György Cserey: Artificial Intelligence and Augmented Reality Serving Disaster Protection with Search and Rescue 
    • In the presentation, we will demonstrate how to facilitate rescue operations using artificial intelligence-based applications and augmented reality. 
    • Our project is motivated by the tragedy of the Hableány sightseeing boat sinking in the Danube. We aim to implement a system that significantly eases the search for and rescue of victims during a disaster and enables operations under difficult conditions. In such situations, professionals and participants in the rescue face numerous challenges: in the river, due to the poor visibility and the current, rescuers were essentially groping in the dark. 
  • 13:35-14:10 Prof. Dominik Belter (Institute of Robotics and Machine Intelligence, Poznan University of Technology): Neural-based Scene Reconstruction for Robotic Grasping 
    • The next step in the development of perception systems for robots is to infer the properties and meaning of objects in the environment. Robots, unlike humans, have limited capabilities to reconstruct 3D objects from a single RGB-D image. They face challenges in perceiving new scenes, particularly when registering objects from a single perspective, which yields incomplete shape information. Partial object models negatively influence the performance of grasping methods. To address this, robots can scan the scene from various perspectives or employ methods that directly fill in unknown regions. 
    • This research reexamines scene reconstruction typically formulated in 3D space, proposing a novel formulation in 2D image space for robots with RGB-D cameras. With such a perception system, the robots are capable of better planning their motion and interacting with the objects based on a single image of the environment, without the need for time-consuming scanning. 
  • 14:10-14:15 break 
  • 14:15-14:50 Prof. Margaret Coad (Innovative Robotics and Interactive Systems Lab, University of Notre Dame): Soft and Continuum Robots for Unstructured Environments
    (08:15-08:50 am in Eastern Standard Time)
    • Soft and continuum robots have immense potential to assist humans with tasks that require navigation and manipulation in unstructured environments. In this talk, I present my group's research on the design, modeling, and control of a variety of soft and continuum robots. I begin by discussing soft vine-inspired robots, which move through their environment by extending from their tip and are well suited for navigation and manipulation within confined spaces. In particular, I discuss our research on vine robot field deployment, shape sensing, force sensing, and collapse modeling. I then present our research on two other bioinspired robots: spider monkey tail-inspired robots for grasping objects, and amoeba-inspired robots for navigation in confined spaces. Finally, I discuss our research on soft wearable robots for replacing or assisting the motion of the upper limbs. This research helps make robots more capable of assisting humans in the unstructured environments of everyday life. 
  • 14:50-15:25 József Benedek Tasi: How to build anatomical prosthetics - mimicking biological tissues in robotics 
    • Current prosthetics are mostly simplified human hands - robust tools capable of the most important grasps and gestures, but not much more. 
    • But what if we wish to make them more human-like? As flexible as soft robots, but durable like their industrial siblings? How do we go about mimicking the intricacies of our anatomy? What technologies and materials to use for creating artificial soft tissues? What processes can automate the production of such personalized devices? 
    • These are the questions we seek to answer in the newly formed Prosthetics lab. Come and join the debate! 
  • 15:25-16:00 Dr. Miklós Koller: Control interfaces to a hand prosthesis, biomechanical modeling and control  
    • In prosthesis control, the standard, off-the-shelf solutions involve only a few (at most 8) channels of surface electromyographic measurements (muscle activation signals measured on the surface of the skin). There are other modalities for uncovering the subject's intention or the underlying muscle activation, but these exist only in the research phase. 
    • In this lecture we briefly review the different prosthetic control modalities we are investigating with our students. These involve surface electromyography, near-infrared analysis, and ultrasound measurements. 
    • We also work on the simulation-based examination of different biomechanical structures; an overview of the different approaches will be presented here as well. 
  • 16:00-16:05 break 
  • 16:05-16:40 Prof. Seungmoon Song (The Neuromechanics of Movement Laboratory, Northeastern University): Towards Digital Motor Clones - predictive neuromechanical simulation of human locomotion
    (10:05-10:40 am in Eastern Standard Time)
    • I will provide an overview and outlook of our research on neuromechanical simulations of human locomotion. Our long-term goal is to develop digital motor clones that predict how individuals move in novel scenarios, with significant implications for studying human physiology, testing assistive devices, and controlling robotic systems. We proposed a reflex-based control model that generates diverse human-like locomotion using a musculoskeletal model in physics simulations. The model has been extended to explain elderly gait, predict the performance of gait assistive ankle exoskeletons, and control robotic systems. I will discuss how we plan to customize these simulations to predict and explain assisted gaits of individuals, and develop versatile motor control models using deep reinforcement learning to cover atypical locomotion behaviors. Additionally, I'll delve into our collaborative efforts surrounding MyoSuite, an integrated open-source platform for neuromechanical simulation. 
  • 16:40-17:15 Dr. Sándor Földi: Measurement and diagnostic applications of continuous blood pressure waveforms 
    • Blood pressure is a vital signal that can give important information about a person's health. Analyzing the continuous blood pressure waveform can give even more information; therefore, it can be a great diagnostic option. However, measuring a good-quality continuous blood pressure signal is challenging, so its application in diagnostics is still quite rare. 
    • In this lecture, the following topics will be covered: First, a brief summary of how the blood pressure wave is created, how it propagates through the arteries, and what a typical signal waveform looks like. Then, several measurement solutions are introduced, including the one developed (and still under development) at our faculty. In the third part of the lecture, a short summary is presented of how the continuous blood pressure waveform can be utilized for diagnostic purposes.