ValueError: If specifying TensorSpec names for nested structures, either zero or all names have to be specified

Hi, I’m trying to export a GAT model in TF 2.1.0 SavedModel format. I was previously able to do so, but after refactoring to more concise code, the following error appeared:

`~/anaconda3/envs/recom/lib/python3.7/site-packages/tensorflow_core/python/framework/ in _get_defun_inputs(args, names, structure, flat_shapes)
   1189     specified_names = [arg.name for arg in tensor_specs if arg.name]
   1190     if specified_names and len(specified_names) < len(tensor_specs):
-> 1191       raise ValueError("If specifying TensorSpec names for nested structures, "
   1192                        "either zero or all names have to be specified.")

ValueError: If specifying TensorSpec names for nested structures, either zero or all names have to be specified.`
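For context, the check that raises this is simple: when TensorFlow flattens a nested input structure into a list of TensorSpecs, it requires that either none or all of them carry a name. A plain-Python sketch of that validation (hypothetical helper name, not the actual TensorFlow internals):

```python
# Plain-Python sketch (hypothetical helper, not TensorFlow's real code)
# of the validation behind this error: when a nested input structure is
# flattened into TensorSpecs, either none or all of them may be named.
def check_spec_names(spec_names):
    """spec_names: the .name of each flattened TensorSpec (str or None)."""
    specified = [name for name in spec_names if name]
    if specified and len(specified) < len(spec_names):
        raise ValueError(
            "If specifying TensorSpec names for nested structures, "
            "either zero or all names have to be specified."
        )

check_spec_names(["features", "indices"])  # all named: fine
check_spec_names([None, None])             # none named: fine
try:
    check_spec_names(["features", None])   # mixed: raises ValueError
except ValueError as err:
    print(err)
```

Presumably the full-batch models trip this because their nested inputs end up with only some of the flattened specs named during export.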

To reproduce, simply create a Python 3.7 env, install stellargraph and its dependencies with pip, and after training the model in the GAT demo, try to export it in the next cell with:

tf.keras.models.save_model(model, export_dir)

or tf.saved_model.save(model, export_dir)

or even model.save(export_dir)

I’m running macOS Catalina 10.15 with Anaconda.

Any guidance would be appreciated!

Thanks for letting us know. It seems this affects all of our “full batch” models. I’ve filed #1251, but we haven’t started investigating it yet. One temporary workaround would be to switch to a different model, such as GraphSAGE.

We’ll be updating that issue and this thread as we resolve it.


Thanks! Just an update: I was able to export it using tf.compat.v1.keras.experimental.export_saved_model(model, export_path).
But I was unable to import it back with:

emb_model = tf.compat.v1.keras.experimental.load_from_saved_model(
    import_path, custom_objects=None
)

~/anaconda3/envs/recom/lib/python3.7/site-packages/tensorflow_core/python/keras/utils/ in class_and_config_for_serialized_keras_object(config, module_objects, custom_objects, printable_module_name)
    248     cls = module_objects.get(class_name)
    249     if cls is None:
--> 250       raise ValueError('Unknown ' + printable_module_name + ': ' + class_name)
    252   cls_config = config['config']

ValueError: Unknown layer: SqueezedSparseConversion

Ah, good find. I’ll make a note on the issue about that too.

StellarGraph defines various custom Keras layers, which need to be passed to custom_objects. This can either be done manually, maybe something like:

emb_model = tf.compat.v1.keras.experimental.load_from_saved_model(
    import_path,
    custom_objects={
        "GraphAttentionSparse": sg.layer.GraphAttentionSparse,
        "SqueezedSparseConversion": sg.layer.SqueezedSparseConversion,
    },
)

(Swap GraphAttentionSparse for GraphAttention if you created the FullBatchNodeGenerator with sparse=False.)
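To see why the loader complains, here’s a plain-Python sketch (stand-in classes, not the actual Keras implementation) of how Keras-style deserialization resolves a layer’s class name via its built-in registry plus the user-supplied custom_objects dict:

```python
# Plain-Python sketch (hypothetical, not Keras's real code) of why
# "Unknown layer" is raised: deserialization maps the class name stored
# in the model config back to a class, looking first at built-in layers
# and then at the user-supplied custom_objects dict.
class Dense:                     # stands in for a built-in Keras layer
    pass

class SqueezedSparseConversion:  # stands in for a StellarGraph layer
    pass

BUILTINS = {"Dense": Dense}

def class_for_name(class_name, custom_objects=None):
    lookup = dict(BUILTINS)
    lookup.update(custom_objects or {})
    cls = lookup.get(class_name)
    if cls is None:
        raise ValueError("Unknown layer: " + class_name)
    return cls

class_for_name("Dense")  # found among built-ins
try:
    # without custom_objects, the custom layer is unknown:
    class_for_name("SqueezedSparseConversion")
except ValueError as err:
    print(err)
# passing custom_objects resolves it:
class_for_name(
    "SqueezedSparseConversion",
    {"SqueezedSparseConversion": SqueezedSparseConversion},
)
```

So any layer class that isn’t in Keras’s own registry, like StellarGraph’s, has to be supplied through custom_objects at load time.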

Or automatically using the custom_keras_layers variable that’s available at the top level of stellargraph:

emb_model = tf.compat.v1.keras.experimental.load_from_saved_model(
    import_path, custom_objects=sg.custom_keras_layers
)

I’ve filed an issue about making this more obvious, since it applies even once we fix #1251 above.