"toy around with the architecture" What is architecture in Tensorflow?

I saw the text "Define the model, you can toy around with the architecture." in a Jupyter notebook. I am a software developer, and when we say "architecture" it usually means there are many larger components. So I am curious what it means in the data science world, or in TensorFlow specifically.

I see it used in the machine learning literature to refer to the pattern of layers in a neural network. Single input and output versus multiple? Does it incorporate pooling layers? Dropout? Convolution? Are there groups of layers that repeat? A linear, direct flow or skip connections? Etc.
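As a minimal sketch (not from the original notebook, and assuming it uses the Keras API), "toying around with the architecture" would mean changing the layers below: swapping them out, resizing them, or adding/removing things like pooling and dropout:

```python
import tensorflow as tf

# One possible "architecture": the choice, sizing, and ordering of these
# layers is exactly what you would toy around with.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),     # pooling layer -- optional
    tf.keras.layers.Dropout(0.25),      # dropout -- optional regularization
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.summary()  # prints the layer-by-layer structure, i.e. the "architecture"
```

Doubling the Dense width, removing the Dropout layer, or stacking a second Conv2D/MaxPooling2D pair would each give you a different architecture for the same task.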
