1. Introduction

  2. Some background on deep learning

    1. Representation of numerical data

    2. Perceptron

    3. Multilayered networks and deep learning

  3. Sequential models

    1. Overall view

    2. Some key notions

    3. Notions of feature hierarchy and self-attention

  4. How could attention be realized in quantum biology according to TGD?

    1. The notion of magnetic body

    2. How could bits and qubits be represented?

    3. How could communications and control be realized?

    4. Could p-adic topologies provide a model for feature hierarchies?

    5. An analog of a multi-perceptron model related to holography in TGD