Whither computing
“A computer once beat me at chess, but it was no match for me at kick boxing.” —Emo Philips
“There is plenty of room at the bottom.” —Richard P. Feynman
There is a significant push toward mimicking biological systems in the computing machinery of tomorrow, including the perception and cognition displayed by rat brains and the vast information storage capacity of DNA (e.g., exabytes of data in small DNA clusters), a density Feynman alluded to in his 1959 APS meeting talk.
Some of us worked in hardware in the early 1980s, focusing on issues related to hierarchical interconnects, thermal management, and transistor performance. Even then, leadership was cognizant that generational advances in on-chip device density came with substantial challenges for off-chip interconnects and thermal management. The system-level software and hardware architecture gurus had to be made aware that how they mapped processes onto devices could create hot spots, and that micron-scale hot regions could drop chip speeds by an order of magnitude. It was then that system technology co-design and co-optimization began to take full effect.

With significant advances in lithography and the resultant device miniaturization, the need for higher density off-chip interconnections grew, leading to smaller bump sizes in IBM's controlled collapse chip connection (C4) technology. The hot spots kept becoming hotter, despite directed efforts to reduce them, prompting many novel thermal management solutions. Work on high-performance computing helped to further the understanding and overall optimization of the “computing stack.”
Research directions pursued then by the semiconductor physics, interconnects, and thermal management communities have endured, with one new development: biosystems-based computing. Researchers are focusing on how to reproduce the “learning” ability of these systems. Examples that leverage neuroplasticity include the Human Brain Project in Europe, the earlier BrainPort (by neuroscientist Paul Bach-y-Rita), and the BrainGate project at Brown and Stanford Universities. Initial demonstrations involved sensory substitution that enabled the blind to “see” using a chair designed by Bach-y-Rita. The ability of a neuronal bundle to learn was demonstrated by a hybrid robot, Hybrot (Georgia Institute of Technology), controlled by cultured rat neurons, which could “drive” around obstacles, a feat that would require substantial computation with purely electronic systems.
Current projects are exploring the learning capabilities of cell bundles through cell cultures grown on electrodes and subjected to stimuli. Research also focuses on understanding the complex coordination between the genetic material in the cells, with its encoded information, and the other biological functions that drive their cellular “lives.” Such comprehensive understanding enables the biocomputing community to architect hybrid systems in which “living” biological components are interfaced with electronics to provide previously unrealized learning capabilities. Materials scientists in both the “hard” and “soft” matter areas are working to study the interfaces and electrochemical interactions in these complex systems.
Underlying these experiments is an interest in understanding how the brain's hardware and computing functions work, how they develop our perception of the surrounding world, and how they create knowledge. Understanding the way brains of different species work will allow the development of biocomputing systems with the desired properties and an important common feature: low-energy operation. And throughout this venture, materials scientists will need to be in step with neuroscientists, medical practitioners, psychologists, ethicists, sociologists, and, of course, computer scientists and engineers.