Perceptron Learning and Classification in a Modeled Cortical Pyramidal Cell

Toviah Moldwin*, Idan Segev

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

22 Scopus citations

Abstract

The perceptron learning algorithm and its multiple-layer extension, the backpropagation algorithm, are the foundations of the present-day machine learning revolution. However, these algorithms utilize a highly simplified mathematical abstraction of a neuron; it is not clear to what extent real biophysical neurons, with morphologically extended non-linear dendritic trees and conductance-based synapses, can realize perceptron-like learning. Here we implemented the perceptron learning algorithm in a realistic biophysical model of a layer 5 cortical pyramidal cell with a full complement of non-linear dendritic channels. We tested this biophysical perceptron (BP) on a classification task, where it was required to assign the correct binary label to 100, 1,000, or 2,000 patterns, and on a generalization task, where it was required to discriminate between two "noisy" patterns. We show that the BP performs these tasks with an accuracy comparable to that of the original perceptron, though the classification capacity of the apical tuft is somewhat limited. We conclude that cortical pyramidal neurons can act as powerful classification devices.
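For readers unfamiliar with the classic algorithm referenced above, the following is a minimal sketch of standard perceptron learning on a binary classification task of the kind described (random patterns with random binary labels). It illustrates the abstract perceptron only, not the paper's biophysical implementation; the pattern count, dimensionality, and learning rate are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# P random binary input patterns across N "synapses", each with a random +/-1 label.
# P is well below the perceptron's classical capacity (~2N), so the task is separable.
P, N = 100, 1000
X = rng.integers(0, 2, size=(P, N)).astype(float)
y = rng.choice([-1.0, 1.0], size=P)

w = np.zeros(N)   # synaptic weights
b = 0.0           # bias (firing threshold)
eta = 0.1         # learning rate

for epoch in range(100):
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # pattern misclassified (or on the boundary)
            w += eta * yi * xi       # classic perceptron weight update
            b += eta * yi
            errors += 1
    if errors == 0:                  # converged: every pattern correctly classified
        break

accuracy = np.mean(np.sign(X @ w + b) == y)
print(f"training accuracy: {accuracy:.2f}")
```

By the perceptron convergence theorem, this loop is guaranteed to stop once the patterns are linearly separable; the paper's contribution is showing that a comparable learning rule can operate when the weighted sum is replaced by a conductance-based pyramidal-cell model.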

Original language: English
Article number: 33
Journal: Frontiers in Computational Neuroscience
Volume: 14
DOIs
State: Published - 24 Apr 2020

Bibliographical note

Publisher Copyright:
© 2020 Moldwin and Segev.

Keywords

  • compartmental modeling
  • cortical excitatory synapses
  • dendritic voltage attenuation
  • machine learning
  • non-linear dendrites
  • perceptron
  • single neuron computation
  • synaptic weights
