A future built around artificial intelligence (AI) can seem like an advanced concept that is out of our reach. Movies about AI often make the technology look hard to understand and control, especially when it involves drones.

However, many scientists from the Massachusetts Institute of Technology's CSAIL are striving to bridge this gap and make AI something that humans can appreciate more. One of the projects they are currently working on is improving the control system of drones.


Drones are useful flying machines that can travel long distances. According to Cameron Chell, CEO of the drone manufacturer Draganfly, they have become even more useful during the current pandemic, as seen in:

  • China and Spain – They used drones with loudspeakers to publicly shame violators of quarantine.


  • Westport, Connecticut – Police are using “pandemic drones” to monitor people’s adherence to social distancing and check their temperatures and heart rates.


Improving how drones are controlled will benefit many more industries and even enhance our safety. Drones are also convenient for reaching remote areas without needing much manpower, which opens up potential industrial applications such as:

  • surveying construction sites
  • inspecting remote equipment on offshore platforms
  • reaching remote infrastructure

In this regard, CSAIL devised a new system that makes drones even more convenient and easier to use.

Interaction Between Drones and Human Biofeedback

The system uses gesture recognition based on muscle signals to control the drone, and its controls are remarkably specific. This smart technology lets you control a drone with simple hand and arm muscle movements such as the following (a rough mapping from gestures to drone commands is sketched after the list):

  • raising the hand to move up
  • lowering the hand to move down
  • clenching a fist to move forward
  • contracting the biceps to stop
  • twisting the hand to the right to rotate the drone clockwise
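
Here is a minimal, hedged sketch of how such a gesture-to-command mapping could look in code. This is not CSAIL's implementation; the gesture names, the VelocityCommand class, and the speed values are hypothetical placeholders chosen to mirror the list above.

# Illustrative sketch only, not CSAIL's code: map recognized gestures to
# velocity commands for a generic quadcopter. All names and values here are
# assumptions made for illustration.
from dataclasses import dataclass

@dataclass
class VelocityCommand:
    """Desired velocities in the drone's body frame (m/s, rad/s)."""
    vx: float = 0.0        # forward (+) / backward (-)
    vz: float = 0.0        # up (+) / down (-)
    yaw_rate: float = 0.0  # clockwise-positive rotation

# One command per gesture from the list above.
GESTURE_COMMANDS = {
    "raise_hand":      VelocityCommand(vz=+0.5),        # move up
    "lower_hand":      VelocityCommand(vz=-0.5),        # move down
    "clench_fist":     VelocityCommand(vx=+0.5),        # move forward
    "contract_biceps": VelocityCommand(),                # stop (all zeros)
    "twist_right":     VelocityCommand(yaw_rate=+0.3),   # rotate clockwise
}

def command_for(gesture: str) -> VelocityCommand:
    """Return the command for a recognized gesture; hover if it is unknown."""
    return GESTURE_COMMANDS.get(gesture, VelocityCommand())

if __name__ == "__main__":
    for g in ["raise_hand", "clench_fist", "contract_biceps", "rest"]:
        print(g, "->", command_for(g))

The mapping layer can stay this simple because the hard part is recognizing the gestures reliably, which is handled by the sensing and classification described below.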

CSAIL’s project is still a work in progress. Impressively, during a testing session of the drone’s responses to gestures, the team found that the drone correctly identified 81% of the 1,535 gestures. They also released a video that briefly explains the system by navigating the drone through rings.

Electromyographic (EMG) electrodes, along with motion sensors worn on the biceps, detect motion from electrical signals, which allows the system to differentiate between different hand and arm movements. CSAIL’s algorithms process these sensor signals to identify the gestures, so the drone reliably reads the intended inputs.
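
To make that pipeline concrete, here is a deliberately simplified sketch of how one short window of muscle (EMG) and motion readings might be turned into a gesture label from the list above. It uses hand-tuned thresholds purely for illustration; CSAIL's actual algorithms are not described here, and every value and name below is an assumption.

# Illustrative sketch only, not CSAIL's pipeline: classify one short window of
# EMG and motion-sensor readings into a gesture label. The thresholds are
# made-up values for demonstration; a real system learns its decision rules
# from data.
import numpy as np

def classify_window(emg: np.ndarray, accel_z: np.ndarray, gyro_z: np.ndarray) -> str:
    """
    emg:     muscle-activation amplitudes from the arm-worn electrodes
    accel_z: vertical acceleration from the motion sensor on the arm
    gyro_z:  rotation rate about the forearm axis
    Returns a gesture label compatible with the mapping sketched earlier.
    """
    emg_level = float(np.mean(np.abs(emg)))  # overall muscle activation
    lift = float(np.mean(accel_z))           # sustained raising/lowering of the arm
    twist = float(np.mean(gyro_z))           # sustained clockwise twist of the hand

    if twist > 1.0:
        return "twist_right"                 # rotate the drone clockwise
    if emg_level > 0.8:
        return "contract_biceps"             # strong contraction -> stop
    if emg_level > 0.4:
        return "clench_fist"                 # moderate contraction -> forward
    if lift > 0.3:
        return "raise_hand"                  # move up
    if lift < -0.3:
        return "lower_hand"                  # move down
    return "rest"                            # nothing recognizable; drone hovers

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noise = rng.normal(0.0, 0.05, 50)
    print(classify_window(noise + 0.5, noise, noise))  # expected: "clench_fist"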

A paper published in March 2020 featured the team's work on the technology. The idea is that drones that can interpret our gestures make interacting with machines feel more like interacting with another person. Smooth human-robot interplay is the primary goal of the team behind the technology.

Beyond that, MIT's research in this area showed that the work could extend to industrial applications and cobotics, which may require less training and programming to operate. After all, the main goal is to control this drone, and other possible technology, with ease while incorporating human intuition.

I’m excited to witness and be part of the gradual developments in the field of artificial intelligence. Have you decided to incorporate AI in your thesis? Let me know in the comments below.
