I have found this challenge called "Graph Neural Networking Challenge" https://bnn.upc.edu/challenge/gnnet2023/ which has been running for 4 years. Out of curiosity, I have tried to use previous implementations. I was able to create datasets, but I can't train the model because, according to the error, it doesn't receive any input. The file `read_dataset` and its function `input_fn` are responsible for the input data, but `input_fn` doesn't produce any output even though my datasets contain data. Instead, it just gives empty tensors such as: `({'traffic': TensorSpec(shape=(None,), dtype=tf.float32, name=None), 'packets': TensorSpec(shape=(None,), dtype=tf.float32, name=None),` and so on. Can someone help me understand this tool and how I can make it work, please? I have been trying for a long time already.
I have access to my datasets and I can see data in them. I have tried to create datasets without that function, but the model expects a specific input format, so that wouldn't work.
The full answer isn't clear to me from the question, but I suspect some clarification on tensor shapes may help.
A `None` entry in a `tf.TensorShape` means the actual size of that dimension is unknown. If so, the shape is said to be not fully defined. (A Tensor that's known to be empty will have at least one dimension of size 0, but that's different.)
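A minimal sketch of that distinction:

```python
import tensorflow as tf

# A shape with an unknown dimension: not fully defined, but it says
# nothing about whether the tensor is empty.
spec = tf.TensorSpec(shape=(None,), dtype=tf.float32)
print(spec.shape)                     # (None,)
print(spec.shape.is_fully_defined())  # False

# An actually empty tensor has a concrete dimension of size 0.
empty = tf.zeros([0])
print(empty.shape)                    # (0,)
```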
In a `tf.data.Dataset`, not-fully-defined shapes can appear in the `Dataset.element_spec` attribute, meaning that the Tensors that will come out of the Dataset may vary in the size of that dimension.
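So `shape=(None,)` in your `element_spec` does not mean the dataset is empty; it only means the length of each element varies. A sketch, using a made-up stand-in for the challenge's `traffic` feature (not the real data):

```python
import tensorflow as tf

# Two "samples" whose 'traffic' vectors have different lengths,
# hence shape=(None,) in the element spec.
ds = tf.data.Dataset.from_generator(
    lambda: iter([{'traffic': [1.0, 2.0, 3.0]},
                  {'traffic': [4.0, 5.0]}]),
    output_signature={'traffic': tf.TensorSpec(shape=(None,), dtype=tf.float32)})

print(ds.element_spec)  # {'traffic': TensorSpec(shape=(None,), ...)} -- looks "empty" but isn't
for elem in ds:
    print(elem['traffic'].shape)  # (3,) then (2,) -- real data comes out
```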
It can also happen that a `tf.Tensor` object has a not-fully-defined shape, namely in places where it does not represent one actual, already-known tensor value but merely serves as a placeholder in defining some future computation. That may seem a bit unusual but is explained well in TF's Introduction to graphs and tf.function. Notice that passing any Python function to `Dataset.map()` will turn it into a `tf.function`. Finally, not-fully-defined shapes can be found on the symbolic tensors of the "Functional API" of Keras, but from the question that does not seem to be the issue here.
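You can see that placeholder behavior in a small sketch (my own toy example, not code from the challenge):

```python
import tensorflow as tf

# Batching without drop_remainder leaves the batch dimension unknown,
# because the last batch may be smaller.
ds = tf.data.Dataset.from_tensor_slices(tf.range(5)).batch(2)

def show_shape(x):
    # Dataset.map traces this as a tf.function: this print runs once,
    # at trace time, where `x` is a symbolic placeholder.
    print('traced shape:', x.shape)   # (None,)
    return x

ds = ds.map(show_shape)
for batch in ds:
    print(batch.numpy())              # [0 1], [2 3], [4]
```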