The robotics devolution is here, and it’s downright freaky. Soft robots are set to gain human-like senses and soft exoskeletons, and they will shortly be able to change shape.

[fvplayer src="https://player.vimeo.com/external/195487032.hd.mp4?s=85875f4a3b1a5c82d932b49c38e98d8876fc0251&profile_id=174" splash="https://christianjournal.net/wp-content/uploads/2016/12/octobot_015_605.jpeg" playlist="https://player.vimeo.com/external/196759773.hd.mp4?s=a720f456487e8586d61f57ff702161a492de2e26&profile_id=174,https://christianjournal.net/wp-content/uploads/2016/12/Capture8-950×531-1.jpeg;https://player.vimeo.com/external/183257458.hd.mp4?s=5a1523e913b0574f15602c325ce0f3cd512dec64&profile_id=119,https://christianjournal.net/wp-content/uploads/2016/09/4th.jpg" preroll="4" postroll="5"]

In August of 2016, a team at Harvard University with expertise in 3-D printing, mechanical engineering, and microfluidics demonstrated the first autonomous, untethered, entirely soft robot.

The small octobot was 3-D printed and paved the way for autonomous ‘soft’ robotics. Within a couple of months, a team at the Harvard John A. Paulson School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering created a finger-like soft robot that can mimic human movements.

“Rather than designing these actuators empirically, we wanted a tool where you could plug in a motion and it would tell you how to design the actuator to achieve that motion,” said Katia Bertoldi, the John L. Loeb Associate Professor of the Natural Sciences and coauthor of the paper.

“The design is so complicated because one actuator type is not enough to produce complex motions,” said Fionnuala Connolly, a graduate student at SEAS and first author of the paper. “You need a sequence of actuator segments, each performing a different motion and you want to actuate them using a single input.”

“This research streamlines the process of designing soft robots that can perform complex movements,” said Conor Walsh, the John L. Loeb Associate Professor of Engineering and Applied Sciences, Core Faculty Member at the Wyss Institute for Biologically Inspired Engineering and coauthor of the paper. “It can be used to design a robot arm that moves along a certain path or a wearable robot that assists with motion of a limb.”

Around the same time, a different team of engineers at Cornell University created sensors that allow robots to ‘sense’ internally, just like humans. Most robots today use external sensors to detect conditions at their surface, but the Cornell engineers developed stretchable optical waveguides that carry light to photodiodes embedded inside the robot’s body, mimicking the way humans sense from within.

“Most robots today have sensors on the outside of the body that detect things from the surface,” said doctoral student Huichan Zhao, who led the study. “Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”

The optical waveguides flex and elongate when touched and pressed, modifying the way light propagates through them to a sensing photodiode at the heart of the system. “If no light was lost when we bend the prosthesis, we wouldn’t get any information about the state of the sensor,” said Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of the Organic Robotics Lab at Cornell University. “The amount of loss is dependent on how it’s bent.”
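
The principle Shepherd describes, that the amount of light lost depends on how the waveguide is bent, amounts to reading a calibration curve in reverse. The sketch below illustrates that idea in Python; the calibration values and function names are hypothetical, not measurements or code from the Cornell study.

```python
# Illustrative sketch: estimating bend from light lost in a stretchable
# waveguide. The calibration pairs below are made-up values, not data
# from the Cornell prosthetic hand.

# (fraction of light lost, bend angle in degrees) calibration pairs
CALIBRATION = [(0.00, 0.0), (0.10, 15.0), (0.25, 35.0), (0.45, 60.0), (0.70, 90.0)]

def loss_fraction(emitted_power: float, received_power: float) -> float:
    """Fraction of light lost between the LED and the photodiode."""
    return 1.0 - received_power / emitted_power

def bend_from_loss(loss: float) -> float:
    """Linearly interpolate a bend angle from the measured optical loss."""
    if loss <= CALIBRATION[0][0]:
        return CALIBRATION[0][1]
    for (l0, a0), (l1, a1) in zip(CALIBRATION, CALIBRATION[1:]):
        if loss <= l1:
            t = (loss - l0) / (l1 - l0)
            return a0 + t * (a1 - a0)
    return CALIBRATION[-1][1]  # beyond the table: report the maximum

# A photodiode reading of 0.75 (from 1.0 emitted) means 25% of the
# light was lost, which the table maps to a 35-degree bend.
print(bend_from_loss(loss_fraction(1.0, 0.75)))
```

Note that with zero loss the sensor is uninformative, which is exactly Shepherd’s point: it is the lost light that carries the information.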

The group used its optoelectronic prosthesis to perform a variety of tasks, including grasping and probing for shape and texture. The hand was able to scan three tomatoes and determine, by softness, which was the ripest.

The team foresees many potential uses for the technology, including prosthetics, bio-inspired robotics, and space exploration.

The researchers hope to increase the sensory capabilities of the optical waveguides by 3D-printing more complex sensor shapes. They also intend to incorporate machine learning into the system to decouple signals from an increased number of sensors.
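
The “decoupling” the researchers describe, separating, say, bend from stretch when both deformations affect every waveguide, can be shown in miniature. The sketch below assumes a known 2×2 linear mixing of two deformation modes into two sensor readings and simply inverts it; in practice a model like this would be learned from data rather than written down, and the numbers here are invented for illustration.

```python
# Minimal sketch of decoupling two deformation modes (bend, stretch)
# from two coupled waveguide readings. The mixing matrix is a made-up
# linear model; a learned model would take its place in a real system.

# Each sensor responds to both modes: reading_i = sum_j MIX[i][j] * mode_j
MIX = [[0.8, 0.3],   # sensor 1: mostly bend-sensitive
       [0.2, 0.9]]   # sensor 2: mostly stretch-sensitive

def decouple(r1: float, r2: float) -> tuple:
    """Invert the 2x2 mixing model to recover (bend, stretch)."""
    a, b = MIX[0]
    c, d = MIX[1]
    det = a * d - b * c
    bend = (d * r1 - b * r2) / det
    stretch = (a * r2 - c * r1) / det
    return bend, stretch

# A pure one-unit bend produces readings (0.8, 0.2); decoupling
# recovers approximately (1.0, 0.0).
print(decouple(0.8, 0.2))
```

With more sensors than modes the inversion becomes a least-squares fit, which is where the machine learning the team mentions would come in.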

But it gets even creepier: as if human senses, flexible fingers, and octopus bodies weren’t enough, now robots will also have ‘hair.’

[fvplayer src="https://player.vimeo.com/external/196765132.hd.mp4?s=f2f845a85e0d1fd150699d559c9c9bc6f9c1e5b4&profile_id=174" splash="https://christianjournal.net/wp-content/uploads/2016/12/Capture8-950×531-1.jpeg"]

And to top that off, robots will soon also have the ability to ‘morph’ into various shapes. Recall the Terminator? Well, a different team at Cornell University created a hybrid material out of hard metal and soft, porous rubber foam that combines the best properties of both: stiffness when it’s called for, and elasticity when a change of shape is required.

Cornell University engineering professor Rob Shepherd said: “It’s sort of like us — we have a skeleton, plus soft muscles and skin.” Shepherd continued, “Unfortunately, that skeleton limits our ability to change shape — unlike an octopus, which does not have a skeleton.”

After seeing all of the evidence, one question remains: are the robots that science and technology are producing meant to replace humans? Let us know in the comments below!

Shortly, soft robots will walk among us. In fact, plans for that are already in the making. It’s called the Fourth Industrial Revolution, and its main aspect is that instead of mankind creating better tools, mankind becomes the tool. Industry 4.0 is the current trend of automation and data exchange in manufacturing technologies, encompassing cyber-physical systems, the Internet of Things, and cloud computing. The Fourth Industrial Revolution will forever blur the lines between the artificial and the natural. (Please see the third video in the playlist above.)

The fields of science and technology have brought the world to the brink, and as mankind rapidly tips over the edge, many don’t realize the consequences of the artificial intelligence and robotics devolution that is overtaking the natural way of life.

Works Cited

Troy Oaks. “Is Morphing Metal the Future of Soft Robotics? Study Says Yes.” Vision Times, 2016. http://bit.ly/2ikoaTF

Leah Burrows. “The first autonomous, entirely soft robot.” Harvard Gazette, 2016. http://bit.ly/2igwd81

Michael Wehner, Ryan L. Truby, Daniel J. Fitzgerald, Bobak Mosadegh, George M. Whitesides, Jennifer A. Lewis, and Robert J. Wood. “An integrated design and fabrication strategy for entirely soft, autonomous robots.” Nature, 2016. http://go.nature.com/2hw2cwJ

Leah Burrows. “Mimicking biological movements with soft robots.” Harvard, 2016. http://bit.ly/2ikwZg5

Huichan Zhao, Kevin O’Brien, Shuo Li, and Robert F. Shepherd. “Optoelectronically innervated soft prosthetic hand via stretchable optical waveguides.” Science Robotics, 2016. http://bit.ly/2hdWYb8

Tereza Pultarova. “Robot can sense internal touch just like humans.” E&T, 2016. http://bit.ly/2igBiwR

Jian Zhang. “Electronic 'hairy skin' could give robots a more human sense of touch.” American Chemical Society, 2016. http://bit.ly/2hw1i3s