Entropy, Information, Landauer's Limit and Moore's Law

Document Type

Conference Paper


This item is available under a Creative Commons License for non-commercial use only



Publication Details

ISSC 2014/CIICT 2014 Conference, Limerick, Ireland, June 26-27.

Abstract


In this paper we explore the link between information and entropy by considering the famous Maxwell's demon thought experiment. A non-rigorous mathematical solution by Leo Szilard established this link for the first time, and Claude Shannon formalised the connection nineteen years later. In 1961, Rolf Landauer's mathematical treatment produced the Landauer limit, which is still hotly debated; here we discuss the implications of this limit for Moore's law and future growth in computing power. A workaround for the limit is proposed using an Analogue Artificial Neural Network (AANN). We mimic the action of a human brain synapse using a memristor connected between two FitzHugh-Nagumo (FN) neuron models. All designs were simulated in OrCAD PSpice${}^{\copyright}$ version 16.5, and a master-slave synapse was built, tested, and its outputs compared to the simulation results. The synapse was also connected in a star-type network, which displayed chaotic-type behaviour for certain parameter values.
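The Landauer limit referred to in the abstract is the minimum energy $k_B T \ln 2$ that must be dissipated to erase one bit of information at temperature $T$. A minimal sketch of that bound (the function name and the choice of 300 K are illustrative, not from the paper):

```python
from math import log

# Boltzmann constant in joules per kelvin (CODATA 2019 exact value)
K_B = 1.380649e-23

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum energy in joules to erase one bit: E = k_B * T * ln(2)."""
    return K_B * temperature_kelvin * log(2)

# At room temperature (~300 K) the bound is on the order of 1e-21 J per bit,
# far below the switching energy of present-day CMOS transistors.
print(f"{landauer_limit(300.0):.3e} J per bit erased")  # -> 2.871e-21 J
```

Because the bound scales linearly with temperature, cooling a computing element lowers the theoretical minimum erasure cost proportionally.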
