
Error building BigM reformulation for ReLU activations #150

Open
isaingit opened this issue Jun 17, 2024 · 2 comments

isaingit commented Jun 17, 2024

Dear developers,

I am getting the following exception when trying to embed a ReLU network in my optimization model:
Exception has occurred: TypeError
'>' not supported between instances of 'NoneType' and 'int'
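
For context, this is Python 3's standard error when an unset (None) value is compared with a number, which suggests some quantity inside the formulation was never given a numeric value:

none_bound = None
none_bound > 0  # raises TypeError: '>' not supported between instances of 'NoneType' and 'int'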

This is the piece of code that triggers the error:

import tensorflow as tf
from pyomo.environ import ConcreteModel
from omlt import OmltBlock
from omlt.io.keras import load_keras_sequential
from omlt.neuralnet import ReluBigMFormulation

model = ConcreteModel()
model.nn = OmltBlock()

keras_model = tf.keras.models.load_model('ip_model_1406S.keras', compile=False)

# Rebuild the trained sub-network as a plain Sequential model.
# Note: tf.keras.Input defines an input tensor, not a layer, so it does
# not show up in fwd_model.layers.
fwd_model = tf.keras.Sequential()
fwd_model.add(tf.keras.Input(shape=(30,), name='IN'))  # 30 = n_periods * n_products
fwd_model.add(tf.keras.layers.Dense(units=keras_model.layers[6].units, activation='relu', name='H1'))
fwd_model.add(tf.keras.layers.Dense(units=keras_model.layers[7].units, activation='relu', name='H2'))
fwd_model.add(tf.keras.layers.Dense(units=keras_model.layers[8].units, activation='linear', name='OUT'))

# Copy the trained weights layer by layer.
fwd_model.layers[0].set_weights(keras_model.layers[6].get_weights())
fwd_model.layers[1].set_weights(keras_model.layers[7].get_weights())
fwd_model.layers[2].set_weights(keras_model.layers[8].get_weights())

net = load_keras_sequential(fwd_model)
formulation_comp = ReluBigMFormulation(net)
model.nn.build_formulation(formulation_comp)

I'd really appreciate it if someone could explain the error and suggest a possible fix.

Thanks in advance!

zkilwein commented

I encountered the same error a few weeks ago. Make sure you provide bounds on the input variables when using the ReLU Big-M formulation: OMLT derives the big-M coefficients from the variable bounds, so when the input bounds are left as None the internal comparison against an integer fails with exactly this TypeError. For example:

scaled_ins = {i: (0, 1) for i in range(30)}  # one (lower, upper) pair per input
net = load_keras_sequential(fwd_model, scaled_input_bounds=scaled_ins)
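
The (0, 1) bounds assume your inputs were min-max scaled to [0, 1] during training; otherwise use each input's true range. With finite bounds in place, the rest of your original script should go through; a minimal sketch, continuing from the snippet above:

# With finite input bounds, ReluBigMFormulation can compute its big-M coefficients.
formulation_comp = ReluBigMFormulation(net)
model.nn.build_formulation(formulation_comp)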

isaingit (Author) commented

Thank you very much for spotting the issue, @zkilwein! That was really helpful.
