[SCIP] Assertion error with barrier algorithm in debug mode
Bayramoglu, Selin
sbayramoglu3 at gatech.edu
Tue Jun 17 20:12:19 CEST 2025
Hi all,
I'm working on a custom cut selector in SCIP 8.0.0 and using the barrier algorithm in diving mode, following the method from this GitHub repo: https://github.com/Opt-Mucca/Analytic-Center-Cut-Selection. Specifically, I compute the analytic centers of the optimal face and of the LP polyhedron at the current node by, respectively, doubling the original objective and setting it to zero in diving mode.
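(For readers unfamiliar with the object being computed: the analytic center of a polyhedron {x : Ax <= b} maximizes sum_i log(b_i - a_i^T x). A minimal numpy sketch of that definition, via damped Newton steps, is below; this is an illustration only, not the SCIP/Gurobi code path discussed above.)

```python
import numpy as np

def analytic_center(A, b, x0, iters=200):
    """Maximize sum(log(b - A x)) starting from a strictly interior x0."""
    x = x0.astype(float)
    for _ in range(iters):
        s = b - A @ x                       # slacks, must stay strictly positive
        g = A.T @ (1.0 / s)                 # gradient of -f, f = sum(log(s))
        H = A.T @ np.diag(1.0 / s**2) @ A   # Hessian of -f
        dx = np.linalg.solve(H, -g)         # Newton step for minimizing -f
        t = 1.0                             # damping keeps the iterate interior
        while np.any(b - A @ (x + t * dx) <= 0):
            t *= 0.5
        x = x + t * dx
        if np.linalg.norm(t * dx) < 1e-12:
            break
    return x

# Unit box [0,1]^2: the analytic center is (0.5, 0.5) by symmetry.
A = np.array([[1.0, 0], [0, 1], [-1, 0], [0, -1]])
b = np.array([1.0, 1, 0, 0])
center = analytic_center(A, b, np.array([0.2, 0.7]))  # -> approx [0.5, 0.5]
```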
I use Gurobi as the LP solver and apply the following diving settings:
lp/initalgorithm = b
lp/resolvealgorithm = b
lp/checkdualfeas = FALSE
lp/disablecutoff = 1
lp/solutionpolishing = 0
lp/checkstability = FALSE
lp/checkfarkas = FALSE
lp/presolving = FALSE  # added by me
lp/scaling = 0  # added by me
Everything runs fine in normal mode, but in debug mode I hit this assertion error:
scip/src/lpi/lpi_grb.c:4116: SCIPlpiGetObjval: Assertion `lpi->solstat != GRB_OPTIMAL || oval == obnd' failed.
Here oval holds the correct objective value, but obnd is -1e+100 (i.e., -GRB_INFINITY, so the bound appears to be unset).
Any ideas on what might be causing this? If there is a better selection of diving settings, I’d be happy to learn that as well.
Best regards,
Selin Bayramoglu
Ph.D. Student
H. Milton Stewart School of Industrial and Systems Engineering
Georgia Institute of Technology