NEUROMECHANICAL AUTOENCODERS: LEARNING TO COUPLE ELASTIC AND NEURAL NETWORK NONLINEARITY

Abstract

Intelligent biological systems are characterized by their embodiment in a complex environment and the intimate interplay between their nervous systems and the nonlinear mechanical properties of their bodies. This coordination, in which the dynamics of the motor system co-evolved to reduce the computational burden on the brain, is referred to as "mechanical intelligence" or "morphological computation". In this work, we seek to develop machine learning analogs of this process, in which we jointly learn the morphology of complex nonlinear elastic solids along with a deep neural network to control it. By using a specialized differentiable simulator of elastic mechanics coupled to conventional deep learning architectures, which we refer to as neuromechanical autoencoders, we are able to learn to perform morphological computation via gradient descent. Key to our approach is the use of mechanical metamaterials, cellular solids in particular, as the morphological substrate. Just as deep neural networks provide flexible and massively parametric function approximators for perceptual and control tasks, cellular solid metamaterials are promising as a rich and learnable space for approximating a variety of actuation tasks. In this work we take advantage of these complementary computational concepts to co-design materials and neural network controls to achieve non-intuitive mechanical behavior. We demonstrate in simulation how it is possible to achieve translation, rotation, and shape matching, as well as a "digital MNIST" task. We additionally manufacture and evaluate one of the designs to verify its real-world behavior.

1. INTRODUCTION

Mechanical intelligence, or morphological computation (Paul, 2006; Hauser et al., 2011), is the idea that the physical dynamics of an actuator may interact with a control system to effectively reduce the computational burden of solving the control task. Biological systems perform morphological computation in a variety of ways, from the compliance of digits in primate grasping (Jeannerod, 2009; Heinemann et al., 2015), to the natural frequencies of legged locomotion (Collins et al., 2005; Holmes et al., 2006; Ting & McKay, 2007), to dead fish being able to "swim" in vortices (Beal et al., 2006; Lauder et al., 2007; Eldredge & Pisani, 2008). Both early (Sims, 1994) and modern (Gupta et al., 2021) work have used artificial evolutionary methods to design mechanical intelligence, but it has remained difficult to design systems de novo that are comparable to biological systems that have evolved over millions of years. We ask: Can we instead learn morphological computation using gradient descent?

Morphological computation requires that a physical system be capable of performing complex tasks using, e.g., elastic deformation. The mechanical system's nonlinear properties work in tandem with neural information processing so that challenging motor tasks require less computation. To learn an artificial mechanically-intelligent system, we must therefore be able to parameterize a rich space of mechanisms with the capability of implementing nonlinear physical "functions" that connect input forces or displacements to the desired output behaviors. There are various desiderata for such a
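To make the co-design idea concrete, the following is a minimal sketch, in JAX, of jointly optimizing morphology parameters and a controller by differentiating through a simulator. The `simulate` function here is a hypothetical toy stand-in for a differentiable elastic simulator (the actual system uses a full mechanics simulation), and `controller`, `theta`, and `w` are illustrative names not taken from the paper:

```python
import jax
import jax.numpy as jnp

def simulate(theta, u):
    # Toy stand-in for a differentiable elastic simulator: a nonlinear
    # coupling of "morphology" parameters theta and actuations u,
    # producing an achieved scalar displacement.
    return jnp.tanh(theta * u).sum()

def controller(w, goal):
    # Minimal linear "neural" controller mapping a goal to actuations.
    return w @ goal

def loss(params, goal):
    theta, w = params
    achieved = simulate(theta, controller(w, goal))
    return (achieved - goal.sum()) ** 2

grad_fn = jax.jit(jax.grad(loss))

# Jointly descend on morphology and controller parameters.
goal = jnp.array([1.0, 0.5])
params = (jnp.ones(2), jnp.eye(2))
lr = 0.1
for _ in range(200):
    grads = grad_fn(params, goal)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```

The essential point is that the same gradient signal flows into both the material parameters and the network weights, so the optimizer is free to trade computation between them.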

