Robotics and Electronics

We are interested in making machines that move in complex environments, change shape, and adapt to external conditions. Living systems have mastered all of these in their fight for survival, and we use their strategies to create a new generation of machines.

Biorobotics Laboratory (prof. Auke Ijspeert)

Swimming-Crawling-Walking Robotics

At the Biorobotics Lab we work on the computational aspects of movement control, sensorimotor coordination, and learning in animals and in robots. We are interested in using robots and numerical simulation to study the neural mechanisms underlying movement control and learning in animals, and in return to take inspiration from animals to design new control methods for robotics as well as novel robots capable of agile locomotion in complex environments.

  1. Amphibious Robotics
  2. Quadruped Robotics
  3. Modular Robotics
  4. Humanoid Robotics
  5. Neural mechanisms of movement and learning
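
A minimal sketch of the kind of controller this work builds on: a chain of coupled phase oscillators acting as a central pattern generator (CPG), of the sort used in the lab's amphibious robots to generate travelling waves along the body. All parameter values below are illustrative, not those of any published controller.

```python
import numpy as np

def simulate_cpg(n_oscillators=8, freq=1.0, coupling=4.0,
                 phase_lag=2 * np.pi / 8, dt=0.01, t_end=10.0):
    """Chain of coupled phase oscillators producing a travelling wave,
    a minimal stand-in for a swimming/crawling CPG (illustrative values)."""
    theta = np.random.uniform(0, 2 * np.pi, n_oscillators)  # oscillator phases
    amplitudes = np.ones(n_oscillators)                      # fixed output amplitude
    outputs = []
    for _ in np.arange(0.0, t_end, dt):
        dtheta = 2 * np.pi * freq * np.ones(n_oscillators)
        # nearest-neighbour coupling enforces a constant phase lag along the chain
        for i in range(n_oscillators):
            if i > 0:
                dtheta[i] += coupling * np.sin(theta[i - 1] - theta[i] - phase_lag)
            if i < n_oscillators - 1:
                dtheta[i] += coupling * np.sin(theta[i + 1] - theta[i] + phase_lag)
        theta += dtheta * dt
        outputs.append(amplitudes * np.cos(theta))  # joint set-points along the body
    return np.array(outputs)

# Each column can drive one body joint; changing `freq` or `phase_lag`
# switches between gaits (e.g. swimming vs. walking waves).
setpoints = simulate_cpg()
```
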
Watch prof. Auke Ijspeert’s TED talk.

Laboratory of Intelligent Systems (prof. Dario Floreano)

Aerial Robotics

We design flying robots, or drones, with rich sensory and behavioural abilities that can change morphology to smoothly and safely operate in different environments. These drones are conceived to work cooperatively and with humans to power civil applications in transportation, aerial mapping, agriculture, search-and-rescue, and augmented virtual reality.

Lab YouTube channel for all the robot videos.

Evolutionary Robotics – RobGen

The promise of Evolutionary Robotics to fully automate the design of robot controllers and/or morphologies is an idea with great appeal not only to researchers, but also to students. Recently, we introduced RoboGen™, an open-source software and hardware platform for Evolutionary Robotics, and described its success as an educational tool in a master's-level course at EPFL. There it was shown that RoboGen provides students with valuable hands-on experience with Evolutionary Robotics, neural networks, physical simulation, 3D printing, mechanical assembly, and embedded processing.
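
To make this concrete, here is a minimal sketch of the evolutionary loop that such platforms automate; the genome encoding, mutation operator and fitness function below are simplified placeholders, not RoboGen's actual API (in a real setup the fitness call would run the encoded robot in a physics simulator).

```python
import random

def evaluate(genome):
    """Placeholder fitness: a real setup would simulate the encoded
    controller/morphology and measure, e.g., distance travelled."""
    return -sum((g - 0.5) ** 2 for g in genome)

def mutate(genome, sigma=0.1):
    """Gaussian perturbation of every gene (illustrative operator)."""
    return [g + random.gauss(0, sigma) for g in genome]

def evolve(pop_size=20, genome_len=10, generations=50):
    population = [[random.random() for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=evaluate)

best_genome = evolve()
```
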

Insect-inspired miniature artificial eye

In most animal species, vision is mediated by compound eyes, which offer lower resolution than vertebrate single-lens eyes, but significantly larger fields of view with negligible distortion and spherical aberration, and high temporal resolution in a tiny package. Compound eyes are ideally suited for fast panoramic motion perception. Here we describe a novel design method for biomimetic compound eyes featuring a panoramic, undistorted field of view in a very thin package. The artificial compound eye prototyped in our lab possesses several characteristics similar to the eye of the fruit fly Drosophila and other arthropod species. This design method opens up new vistas for a broad range of applications where wide-field motion detection is at a premium, such as collision-free navigation of terrestrial and aerospace vehicles, and for the experimental testing of insect vision theories.
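
Because wide-field motion perception is the intended use, the sketch below illustrates one standard model of insect motion sensing, the Hassenstein-Reichardt elementary motion detector, applied to the signals of two neighbouring photoreceptors; the stimulus and time constant are synthetic and illustrative, and this is not the CURVACE processing pipeline.

```python
import numpy as np

def reichardt_emd(left, right, dt=0.001, tau=0.02):
    """Hassenstein-Reichardt elementary motion detector: correlate each
    photoreceptor signal with a low-pass-filtered (delayed) copy of its
    neighbour; the sign of the output encodes motion direction
    (illustrative time constant)."""
    alpha = dt / (tau + dt)                  # first-order low-pass coefficient
    lp_left = np.zeros_like(left)
    lp_right = np.zeros_like(right)
    for t in range(1, len(left)):
        lp_left[t] = lp_left[t - 1] + alpha * (left[t] - lp_left[t - 1])
        lp_right[t] = lp_right[t - 1] + alpha * (right[t] - lp_right[t - 1])
    return lp_left * right - lp_right * left

# Synthetic stimulus: a 5 Hz grating drifting past two neighbouring ommatidia.
t = np.arange(0, 1, 0.001)
left = np.sin(2 * np.pi * 5 * t)
right = np.sin(2 * np.pi * 5 * t - 0.5)      # phase-delayed copy of the left signal
print(reichardt_emd(left, right).mean())     # positive mean: motion from left to right
```
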

Floreano et al., 2013, Miniature curved artificial compound eyes, PNAS, doi:10.1073/pnas.1219068110

For more info: http://curvace.org/

http://news.epfl.ch/news/an-insect-eye-for-drones/

Research Lab: prof. Dario Floreano, Laboratory of Intelligent Systems

Reconfigurable Robotics Lab (prof. Jamie Paik)

While Origami, the traditional Japanese art of paper folding, is primarily known for its artistic significance, many of its components can be found in nature. From insect wings to different types of leaves, organisms take advantage of the ability to create thin and lightweight structures by folding quasi-two-dimensional elements in distinct patterns. At the Reconfigurable Robotics Lab, RRL, we are developing robots that mimic and enhance the benefits of Origami found in nature. Our robots incorporate quasi-two-dimensional smart structures with embedded actuation, sensing, control, and communication, allowing them to change both shape and functionality according to the task at hand. By automating these structural transformations and functional reconfigurations, we are creating versatile robotic systems that are compact for storage and transportation and, when deployed, can self-assemble into lightweight, three-dimensional tools.

Multimodal locomotion using robotic origami

We are exploring a new paradigm in robotics, robotic origami, by developing foldable quasi-2D structures with embedded functionality that can self-transform into desired 3D shapes. Robogami, thanks to its smart, reconfigurable and highly adaptable nature, can be employed as a robot with multi-gait mobility. Because these origami robots are low-cost and fast to manufacture in quasi-2D, several of them can be fabricated for search-and-rescue missions and treated as disposable. While designing our multi-gait origami robots, we investigate their folding geometry, mechanisms, actuation and sensing, materials, fabrication methods, and control strategies. We study different locomotion methods for Robogamis inspired by nature, from crawling to jumping, using foldable structures and selective activation of the folding mechanisms.
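
The folding-geometry aspect can be sketched in a few lines: rotating a tile's vertices about a crease line by a commanded fold angle is the basic operation when predicting what 3D shape a fold pattern will produce. This is a simplified kinematic illustration, not the lab's design software.

```python
import numpy as np

def fold_about_crease(points, crease_point, crease_dir, angle):
    """Rotate 3D points about a crease line (point + direction) by
    `angle` radians using Rodrigues' rotation formula. Chaining such
    rotations over a fold pattern predicts the folded 3D shape."""
    k = np.asarray(crease_dir, dtype=float)
    k /= np.linalg.norm(k)
    p = np.asarray(points, dtype=float) - crease_point
    cos_a, sin_a = np.cos(angle), np.sin(angle)
    rotated = (p * cos_a
               + np.cross(k, p) * sin_a
               + np.outer(p @ k, k) * (1 - cos_a))
    return rotated + crease_point

# Fold a unit square tile by 90 degrees about a crease along one of its edges.
tile = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]])
print(fold_about_crease(tile, crease_point=[0, 0, 0],
                        crease_dir=[1, 0, 0], angle=np.pi / 2))
```
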

Robogami – a new paradigm of robots with augmented adaptability, able to conform to unexpected and ever-changing environments

Robotic Origamis (Robogamis) are functional robots constructed from smart materials that embed actuation, computation, communication, sensing, and interfaces with other components such as specialized sensors. Robogamis are designed and fabricated as flat sheets that can adapt to their environment by transforming their shape using embedded actuators, sensors, and controllers.

Soft robots that mimic human muscles for patient rehabilitation

The RRL is developing soft, flexible and reconfigurable robots. Air-actuated, they behave like human muscles and may be used in physical rehabilitation. They are made of low-cost materials and could easily be produced on a large scale.

EPFL news article: http://actu.epfl.ch/news/soft-robots-that-mimic-human-muscles/

Silicone-embedded distributed piezoelectric sensors for contact detection

The objective of this project is to design a soft, flexible, thin substrate capable of detecting contact. Such a system can be used in wearable applications, enabling the evaluation of external forces, and for functionalizing the surfaces of surgical tools or robotic systems, allowing fast-response contact detection. A layer-by-layer fabrication technique, combined with careful materials selection, allows low fabrication cost, scalability, fast prototyping, and fine tuning of the sensor transfer function.
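
As a signal-level illustration of fast contact detection, the sketch below thresholds a high-pass-filtered piezoelectric trace, exploiting the fact that piezoelectric elements respond to changes in load; the sampling rate, filter constant and threshold are illustrative, not project parameters.

```python
import numpy as np

def detect_contacts(piezo_signal, fs=1000.0, tau=0.02, threshold=0.2):
    """Flag contact events in a piezoelectric sensor trace.
    Piezo elements respond to changes in load, so a first-order
    high-pass filter followed by a threshold catches contact onsets.
    (fs in Hz, tau in seconds; all values illustrative.)"""
    alpha = tau / (tau + 1.0 / fs)           # high-pass filter coefficient
    filtered = np.zeros_like(piezo_signal)
    for i in range(1, len(piezo_signal)):
        filtered[i] = alpha * (filtered[i - 1]
                               + piezo_signal[i] - piezo_signal[i - 1])
    return np.abs(filtered) > threshold      # boolean mask of contact events

# Synthetic trace: a step in force at sample 500 produces a detected onset there.
signal = np.concatenate([np.zeros(500), np.ones(500)]) + 0.005 * np.random.randn(1000)
print(np.argmax(detect_contacts(signal)))    # prints approximately 500
```
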

Research Lab: prof. Jamie Paik, Reconfigurable Robotics Lab

MicroBioRobotic Systems Laboratory (prof. Selman Sakar)

Biomimetic soft micromachines

Nature provides a wide range of inspiration for building mobile micromachines that can navigate through confined heterogeneous environments and perform minimally invasive biomedical operations. We are developing rapid prototyping processes based on selective patterning and programmable self-folding for building biomimetic soft micromachines with three-dimensional mechanisms connected through a flexible backbone.

360° panoptic camera inspired by insect vision

The Panoptic camera is a biologically inspired multi-camera vision sensor. It is a polydioptric system mimicking the eyes of flying insects, where multiple imagers, each with a distinct focal point, are distributed over a hemisphere. Our research activities in this domain concentrate on the realization of customized hardware imaging platforms and on the implementation of omnidirectional image reconstruction (OIR) algorithms on the Panoptic camera platform. The combined hardware/software system enables advanced real-time applications including omnidirectional image reconstruction, 3D model construction (3D display) and depth estimation.
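
The core geometric step of omnidirectional image reconstruction can be sketched as follows: map each pixel of each camera to a viewing direction on the unit sphere and accumulate it into an equirectangular panorama. The pinhole camera model and toy inputs below are illustrative; the real platform uses calibrated multi-camera geometry and more sophisticated blending.

```python
import numpy as np

def accumulate_panorama(images, rotations, focal_px, pano_shape=(256, 512)):
    """Toy omnidirectional stitcher: for every pixel of every camera,
    compute its viewing direction on the unit sphere (pinhole model,
    orientation given by a rotation matrix) and splat its intensity
    into an equirectangular panorama. Illustrative only."""
    pano = np.zeros(pano_shape)
    weight = np.zeros(pano_shape)
    H, W = pano_shape
    for img, R in zip(images, rotations):
        h, w = img.shape
        v, u = np.mgrid[0:h, 0:w]
        rays = np.stack([u - w / 2, v - h / 2, np.full((h, w), focal_px)], axis=-1)
        rays = rays / np.linalg.norm(rays, axis=-1, keepdims=True)
        rays = rays @ R.T                                   # rotate into world frame
        lon = np.arctan2(rays[..., 0], rays[..., 2])        # [-pi, pi]
        lat = np.arcsin(np.clip(rays[..., 1], -1, 1))       # [-pi/2, pi/2]
        px = ((lon + np.pi) / (2 * np.pi) * (W - 1)).astype(int)
        py = ((lat + np.pi / 2) / np.pi * (H - 1)).astype(int)
        np.add.at(pano, (py, px), img)
        np.add.at(weight, (py, px), 1.0)
    return pano / np.maximum(weight, 1)

# Two toy cameras: one looking forward, one rotated 90 degrees to the side.
imgs = [np.random.rand(64, 64), np.random.rand(64, 64)]
Rs = [np.eye(3), np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]])]
pano = accumulate_panorama(imgs, Rs, focal_px=64.0)
```
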

Video that explains the project.

Research lab: developed jointly by prof. Yusuf Leblebici, Microelectronic Systems Laboratory, and prof. Pierre Vandergheynst, Signal Processing Laboratory (LTS2)

Biomimetic behavior of an active tensegrity structure

Biomimetic structures interact with their environment, change their properties, learn and self-repair, thereby providing properties that are similar to those of living organisms. Interactions with the environment involve unique challenges in the fields of computational control, algorithms, damage tolerance, and structural analysis. Tensegrity structures are pin-jointed structures of cables and struts in a self-stress state. They are well suited to active control, since the shape of the structure can be changed by changing the lengths of its elements. When a tensegrity structure integrates sensors, actuators (devices that allow structural members to change their length) and a control system, deployment and modification of structural behavior become feasible. We are developing active control systems for tensegrity structures that are capable of self-diagnosis, self-adaptation and learning.
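
To make the notion of a self-stress state concrete, the sketch below assembles the equilibrium matrix of a toy pin-jointed structure (a square with both diagonals, the planar textbook example) and extracts its self-stress mode as the null space; the geometry is illustrative and unrelated to the lab's structures.

```python
import numpy as np

# Toy planar structure: a square frame with both diagonals. Maxwell's count
# gives one state of self-stress: the diagonals (struts, compression) can be
# balanced by the edges (cables, tension) with no external load.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
members = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (1, 3)]

def equilibrium_matrix(nodes, members):
    """Each column holds the unit direction of one member at its two end
    nodes; member forces t satisfying A @ t = 0 are self-stress states."""
    A = np.zeros((2 * len(nodes), len(members)))
    for m, (i, j) in enumerate(members):
        u = (nodes[j] - nodes[i]) / np.linalg.norm(nodes[j] - nodes[i])
        A[2 * i:2 * i + 2, m] = u
        A[2 * j:2 * j + 2, m] = -u
    return A

A = equilibrium_matrix(nodes, members)
U, s, Vt = np.linalg.svd(A)
self_stress = Vt[s < 1e-9]        # null-space rows = self-stress states
print(self_stress)                # one state: edge and diagonal forces of opposite sign
```
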

N. Veuve, S.D. Safaei and I.F.C. Smith, Towards development of a biomimetic tensegrity footbridge

Research Lab: prof. Ian Smith, Applied Computing and Mechanics Laboratory

Bionic Limbs

Myoelectric prosthetic hands and fingers allow limb amputees to regain the ability to perform several tasks involved in everyday living, representing a significant functional gain. Despite these advantages, they are often rejected by patients. Amongst the most common reasons cited for this is the lack of sensory feedback in currently available prostheses, forcing users to rely on vision to guide their movements. One of the major goals in the development of future upper-limb prostheses is thus the addition of sensory feedback. We pursue two parallel approaches for restoring touch in upper-limb amputees: an invasive approach (using TIME electrodes implanted into the nerves) and a noninvasive approach (using electrodes placed on the surface of the skin).
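
As an illustration of one common encoding strategy for sensory feedback, the sketch below maps a fingertip pressure reading to the pulse frequency of a stimulation train, so that a firmer touch produces a higher pulse rate; the ranges and the linear mapping are illustrative placeholders, not clinical parameters.

```python
def pressure_to_stimulation(pressure_n, max_pressure_n=10.0,
                            f_min_hz=10.0, f_max_hz=100.0):
    """Map a fingertip pressure reading (in newtons) to a stimulation
    pulse frequency: stronger touch -> higher pulse rate. The linear
    mapping and all ranges are illustrative placeholders."""
    p = max(0.0, min(pressure_n, max_pressure_n)) / max_pressure_n
    return f_min_hz + p * (f_max_hz - f_min_hz)

# Example: a light touch and a firm grasp produce different pulse rates.
for force in (0.5, 3.0, 9.0):
    print(f"{force:4.1f} N -> {pressure_to_stimulation(force):5.1f} Hz pulse train")
```
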

EPFL news article for more information regarding the latest bionic fingertip: https://actu.epfl.ch/news/amputee-feels-texture-with-a-bionic-fingertip-5/

Research Lab: prof. S. Micera, Translational Neural Engineering Lab