Our neurons seem capable of handling any type of data, regardless of its scale or statistical properties. In this letter, we suggest that optimal coding may occur at the single-neuron level without requiring memory, adaptation, or an evolution-driven fit to the stimuli. We call a neural circuit optimal if it maximizes the mutual information between its inputs and outputs. We show that the commonly encountered differentiator neurons, i.e., neurons that respond mainly to changes in their input, can use their full information capacity when handling samples drawn from any statistical distribution. We demonstrate this optimality both analytically and in simulations. Beyond illustrating the simplicity and elegance of neural processing, this result may suggest ways to improve the handling of data by artificial neural networks.
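The flavor of the claim can be conveyed by a toy simulation (our own illustrative sketch, not the letter's actual model or proof): a binary differentiator that reports only whether its input just increased satisfies P(increase) = 1/2 for any i.i.d. continuous input stream, by exchangeability of consecutive samples, and therefore attains its full 1-bit output capacity for every input distribution.

```python
import numpy as np

# Toy sketch: a binary "differentiator neuron" that fires when the input
# increases. For i.i.d. continuous inputs, P(x_t > x_{t-1}) = 1/2 regardless
# of the distribution, so the output entropy is the maximal 1 bit.
rng = np.random.default_rng(0)
n = 200_000

distributions = {
    "uniform":     rng.uniform(0, 1, n),
    "gaussian":    rng.normal(0, 1, n),
    "lognormal":   rng.lognormal(0, 2, n),
    "exponential": rng.exponential(1, n),
}

for name, x in distributions.items():
    p_up = (np.diff(x) > 0).mean()                  # empirical P(increase)
    p = np.clip(p_up, 1e-12, 1 - 1e-12)
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))  # output entropy, bits
    print(f"{name:11s} P(up) = {p_up:.3f}  H(output) = {h:.3f} bits")
```

Despite the four input distributions having wildly different scales and shapes, the differentiator's output entropy is close to 1 bit in every case.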