The resurgence of neural networks in the late 20th century can be attributed to several factors.

Algorithmic advancements

The foundational concepts of neural networks included simple models such as the perceptron. These models were limited because they could not solve problems that are not linearly separable, the XOR function being the classic example. A robust training method for multi-layered networks was also lacking.
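The limitation can be demonstrated directly. The sketch below (an illustrative implementation, not any historical program) trains a perceptron with the classic learning rule: it learns the linearly separable AND function but, by the same rule, can never settle on weights for XOR, since no single line separates XOR's classes.

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Classic perceptron learning rule with a threshold activation."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            # Update weights only when the prediction is wrong
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_and = np.array([0, 0, 0, 1])  # linearly separable: the rule converges
y_xor = np.array([0, 1, 1, 0])  # not linearly separable: no weights exist

w, b = train_perceptron(X, y_and)
preds = [1 if xi @ w + b > 0 else 0 for xi in X]
print(preds)  # reproduces the AND truth table: [0, 0, 0, 1]
```

Running the same routine on `y_xor` leaves the weights oscillating indefinitely, which is precisely the failure that motivated multi-layered networks.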

However, in the late 20th century, computer scientists such as Geoffrey Hinton popularised the backpropagation algorithm for training multi-layered neural networks, although he was not the first to propose the approach. Through backpropagation, more accurate multi-layered neural networks could be trained.
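The idea behind backpropagation is to compute the error at the output and propagate its gradient backwards through each layer to update the weights. A minimal sketch follows, training a small two-layer sigmoid network on XOR; the architecture (four hidden units), learning rate, and iteration count are illustrative assumptions, not details from the historical work.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# One hidden layer of 4 units, one output unit (illustrative sizes)
W1 = rng.standard_normal((2, 4)); b1 = np.zeros(4)
W2 = rng.standard_normal((4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(10000):
    # Forward pass: compute activations layer by layer
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule pushes the error gradient
    # from the output layer back through the hidden layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(losses[0], losses[-1])  # error falls substantially during training
```

The crucial point is the hidden-layer step: without a way to assign error to hidden units, multi-layered networks could not be trained, which is exactly the gap backpropagation closed.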

Increased computational power

The late 20th century also saw the widespread adoption of digital computers. Perceptrons and neural networks could be emulated in software on these machines, yielding far better results than purpose-built perceptron hardware. As a result, simulation programs were developed for various fields, such as character recognition.