The need to analyze increasingly complex data and to make decisions based on them calls for new methodologies and computational tools that adapt to this revolution. In this sense, Operations Research, together with Statistics and other disciplines linked to Data Science, plays a key role in providing explainability to data-driven decision-making processes, as well as in delivering solutions that are transparent, efficient, and easy to adapt to new situations. In this session, several works related to these topics will be presented, and young researchers will have the opportunity to share and disseminate their latest contributions in the area.
90Bxx (primary)
We propose a data-driven approach for constructing firmly nonexpansive (FNE) operators. We demonstrate its applicability in Plug-and-Play methods and provide a sound mathematical background for the problem of learning FNE operators via expected and empirical risk minimization. Furthermore, we derive a solution strategy that ensures firmly nonexpansive, piecewise affine operators within the convex envelope of the training set. Finally, we show its applicability to image denoising problems.
Joint work with Kristian Bredies and Emanuele Naldi.
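For context, a minimal reminder of the standard definition underlying the abstract above (general background, not a formulation specific to this work):

```latex
% An operator T : \mathbb{R}^n \to \mathbb{R}^n is firmly nonexpansive (FNE) if
\[
  \|T(x) - T(y)\|^{2} \;\le\; \langle x - y,\; T(x) - T(y) \rangle
  \qquad \text{for all } x, y \in \mathbb{R}^n,
\]
% equivalently, T = \tfrac{1}{2}(\mathrm{Id} + N) for some nonexpansive
% (1-Lipschitz) operator N, which is what makes FNE denoisers attractive
% as Plug-and-Play building blocks with convergence guarantees.
```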
Handling censored data is key in reliability analysis. Interval censoring occurs when failure times are known only to fall within a specific interval rather than being observed exactly. Many modern devices are extremely reliable, requiring extended testing under normal conditions. Accelerated life tests speed up failure by increasing stress factors. Classical likelihood-based methods can be affected by data contamination, so robust estimators based on distance measures are developed for reliable inference.
Joint work with N. Balakrishnan and L. Pardo.
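As background for the abstract above, a sketch of how interval censoring typically enters the estimation problem (assumed notation; the abstract does not specify the exact distance measures used):

```latex
% If unit i is only known to fail in the interval (l_i, u_i], the classical
% likelihood for a lifetime model with cdf F(\cdot;\theta) is
\[
  L(\theta) \;=\; \prod_{i=1}^{n} \bigl[ F(u_i;\theta) - F(l_i;\theta) \bigr].
\]
% Robust alternatives replace maximum likelihood by the minimization of a
% distance (divergence) between the observed interval frequencies and the
% model probabilities, downweighting contaminated observations.
```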
Some degrading systems consist of two parts or components whose overall degradation is described by a bivariate process. We analyse the correlation between the components in a bivariate degradation model subject to imperfect maintenance. The underlying degradation is modelled as a Wiener process, and the effect of maintenance is assumed to be imperfect, described by an Arithmetic Reduction of Degradation (ARD) model: each maintenance action reduces the degradation of the system by an amount proportional to its state just before maintenance.
Joint work with Inma T. Castro, Christophe Bérenguer, Olivier Gaudoin and Laurent Doyen.
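A minimal sketch of the maintenance effect described in the abstract above (ARD-type model; notation assumed):

```latex
% Let X(t) be the system degradation, evolving between interventions as a
% Wiener process with drift. If a maintenance action takes place at time
% \tau_k, the imperfect repair is modelled as
\[
  X(\tau_k^{+}) \;=\; (1 - \rho)\, X(\tau_k^{-}), \qquad \rho \in [0, 1],
\]
% i.e., the action removes a fraction \rho of the degradation present just
% before maintenance (\rho = 0: no effect; \rho = 1: perfect repair).
```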
This work merges statistical modeling and mathematical optimization to create surrogate optimization models using shape-constrained generalized additive models. A novel framework enables the use of shape-constrained smooth functions in non-parametric regression, with the aim of approximating complex functions arising in mixed-integer nonlinear programming. The shape-constrained models exploit separability, which makes these problems more tractable.
Joint work with Manuel Navarro-García, María Durbán, Claudia D'Ambrosio and Renan Spencer Trindade.
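An illustrative picture of the surrogate idea in the abstract above, under assumed notation (not the authors' exact formulation):

```latex
% A hard-to-optimize nonlinearity f appearing in a MINLP is replaced by a
% separable, shape-constrained additive fit
\[
  f(x) \;\approx\; \sum_{j=1}^{p} \hat{f}_j(x_j),
\]
% where each univariate \hat{f}_j is a smooth (e.g., penalized-spline)
% estimate fitted under shape constraints such as monotonicity, convexity
% or concavity, so that the resulting surrogate can be embedded in the
% optimization model while remaining tractable.
```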
The shortage of ICU beds highlights the challenges hospitals face with limited resources. Increasing demand forces prioritization, leading to delayed surgeries or early discharges and creating logistical issues. This study develops methodologies based on real data to predict ICU length of stay and improve resource planning. Through simulations, we aim to optimize hospital management, enhancing resource utilization and improving patient outcomes.
Joint work with Ana María Anaya-Arenas, Janosch Ortmann, Angel Ruiz and Fermín Mallor.
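A minimal sketch of the kind of simulation the abstract above refers to, assuming Poisson arrivals and exponential lengths of stay; all parameters are hypothetical, not the study's fitted values:

```python
import heapq
import random

def simulate_icu(n_beds=20, arrival_rate=4.0, mean_los=3.5, horizon=365.0):
    """Estimate the fraction of arrivals finding no free ICU bed."""
    random.seed(0)
    t, arrivals, rejected = 0.0, 0, 0
    discharges = []                                # heap of discharge times
    while t < horizon:
        t += random.expovariate(arrival_rate)      # next patient arrival (days)
        while discharges and discharges[0] <= t:   # release finished stays
            heapq.heappop(discharges)
        arrivals += 1
        if len(discharges) < n_beds:               # a bed is available
            los = random.expovariate(1.0 / mean_los)  # exponential LOS (assumption)
            heapq.heappush(discharges, t + los)
        else:
            rejected += 1                          # bed-shortage event
    return rejected / arrivals

print(f"Estimated rejection rate: {simulate_icu():.3f}")
```

In practice, the LOS distribution would be replaced by one fitted to real patient data, and the simulation used to compare staffing and bed-allocation policies.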
In this work, we consider a novel ensemble learning framework inspired by modern portfolio optimization to address regression problems. Four distinct KKT reformulations of the ensemble framework are considered, taking into account different combinations of constraints and the inclusion of a diversification term in the objective function. Additionally, multiple metaheuristics are implemented, and the results of all methods are compared against benchmark ensemble methods from the literature.
Joint work with Antonio Manuel Durán Rosal, Natividad González Blanco, Javier Pérez-Rodríguez and Francisco Fernández-Navarro.
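A hedged sketch of the portfolio analogy in the abstract above: base learners play the role of assets and the covariance of their residuals plays the role of risk. This is a generic Markowitz-style variant, not the authors' KKT reformulations:

```python
import numpy as np
from scipy.optimize import minimize

def ensemble_weights(errors):
    """errors: (n_samples, n_learners) residuals of each base learner."""
    sigma = np.cov(errors, rowvar=False)          # error covariance ("risk")
    m = errors.shape[1]
    objective = lambda w: w @ sigma @ w           # portfolio-style variance
    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)  # weights sum to 1
    bounds = [(0.0, 1.0)] * m                     # long-only weights
    res = minimize(objective, np.full(m, 1.0 / m), bounds=bounds,
                   constraints=cons)
    return res.x

# Toy usage with synthetic residuals from four hypothetical learners.
rng = np.random.default_rng(0)
w = ensemble_weights(rng.normal(size=(200, 4)))
print(w.round(3), w.sum().round(3))
```

A diversification term, as mentioned in the abstract, could be added to the objective to penalize concentrating all weight on a single learner.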
We propose a multi-stage stochastic programming model for the optimal participation of energy communities in electricity markets. The multi-stage aspect captures the different times at which variable renewable generation and electricity prices are observed. This results in large-scale optimization problem instances containing large scenario trees with 34 stages, to which scenario reduction techniques are applied. Case studies with real data are discussed to analyse proposed regulatory frameworks in Europe. The added value of considering stochasticity is also analysed.
Joint work with Marlyn D. Cuadrado, F.-Javier Heredia Cervera, and Ignasi Mañé Bosch.
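For context, the generic multi-stage structure behind such models (indexing assumed; not the paper's exact formulation):

```latex
% Over scenario-tree nodes n with parent a(n) and probability p_n, the
% community's decisions x_n solve
\[
  \min_{x} \; \sum_{n} p_n \, c_n^{\top} x_n
  \quad \text{s.t.} \quad T_n x_{a(n)} + W_n x_n = h_n, \qquad x_n \in X_n,
\]
% where nonanticipativity is enforced implicitly by indexing decisions on
% tree nodes rather than on full scenarios; with 34 stages the tree, and
% hence the instance size, grows quickly, motivating scenario reduction.
```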
In today's competitive environment, organizations are leveraging incentive models to enhance performance. This talk introduces an incentive model based on Data Envelopment Analysis (DEA) that rewards decision-making units (DMUs) achieving, or moving toward, technical efficiency. Grounded in game theory and Nash equilibrium, the model balances incentives with investment costs. Applicable across sectors, it promotes continuous improvement and efficient resource utilization.
Joint work with Juan Carlos Gonçalves, and Juan Aparicio.
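As background, the classical input-oriented CCR envelopment model on which DEA efficiency scores are based (standard material, not the incentive model itself):

```latex
% For a unit DMU_0 with inputs x_0 and outputs y_0, and peers (x_j, y_j),
\[
  \theta^{*} \;=\; \min_{\theta, \lambda} \; \theta
  \quad \text{s.t.} \quad \sum_{j} \lambda_j x_j \le \theta x_0, \quad
  \sum_{j} \lambda_j y_j \ge y_0, \quad \lambda \ge 0.
\]
% DMU_0 is technically efficient when \theta^{*} = 1; an incentive scheme
% can then reward units according to \theta^{*} or its improvement over time.
```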
Health emergencies require swift and accurate decisions that save people's lives. This study focuses on the ambulance location-allocation problem in the Basque Country, whose fleet comprises Advanced Life Support (ALS) and Basic Life Support (BLS) vehicles. We propose a two-stage stochastic MILP model that maximizes the expected coverage and optimizes three secondary objectives within a hierarchical decision framework. Due to the model's computational challenge, we introduce a matheuristic algorithm based on primal decomposition.
Joint work with María Merino, and Unai Aldasoro.
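An illustrative skeleton of the expected-coverage objective (assumed notation, not the exact model in the abstract above):

```latex
% Locate y_{vj} vehicles of type v \in \{\mathrm{ALS}, \mathrm{BLS}\} at
% bases j to maximize expected demand coverage over scenarios s,
\[
  \max \; \sum_{s} p_s \sum_{i} d_{is}\, z_{is}
  \quad \text{s.t.} \quad z_{is} \le \sum_{j \in J_i} \sum_{v} y_{vj}, \qquad
  \sum_{j, v} y_{vj} \le V, \qquad z_{is} \in \{0, 1\},
\]
% where J_i is the set of bases within the response-time standard of demand
% node i, d_{is} the demand, p_s the scenario probability and V the fleet
% size; secondary objectives are then handled hierarchically.
```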
Two optimization models are presented to assist in the optimal planning of firefighting helicopters. The first model addresses the optimal planning of helicopters working on extinguishing a wildfire. The second deals with the assignment of aircraft to wildfires, taking into account the risk of new wildfires starting. Both models are complex, so the optimization problems require the design of metaheuristic algorithms that obtain good solutions in a reasonable time.
Joint work with María José Ginzo Villamayor, Fernando Pérez Porras, María Luisa Carpente Rodríguez, and Silvia María Lorenzo Freire.
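A generic sketch of the kind of metaheuristic mentioned in the abstract above (a simple simulated annealing for an aircraft-to-wildfire assignment; the cost function and move are toy placeholders, not the authors' algorithm):

```python
import math
import random

def anneal(cost, initial, neighbor, t0=1.0, cooling=0.995, iters=10_000):
    """Generic simulated annealing: minimize cost over candidate solutions."""
    current, best = initial, initial
    t = t0
    for _ in range(iters):
        cand = neighbor(current)
        delta = cost(cand) - cost(current)
        # Accept improvements always; worsenings with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= cooling                       # geometric cooling schedule
    return best

def neighbor(assignment, n_fires=3):
    """Move one randomly chosen aircraft to a random fire."""
    new = assignment[:]
    new[random.randrange(len(new))] = random.randrange(n_fires)
    return new

# Toy usage: 6 aircraft, 3 fires, dummy cost = workload imbalance.
random.seed(1)
cost = lambda a: max(a.count(f) for f in range(3)) - min(a.count(f) for f in range(3))
print(anneal(cost, [0, 0, 0, 1, 1, 2], neighbor))
```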
We propose a novel Mixed-Integer Non-Linear Optimization formulation to construct a risk score, seeking a trade-off between prediction accuracy and sparsity. Previous approaches are typically designed to handle binary datasets, where numerical predictor variables are discretized in a preprocessing step using arbitrary thresholds, such as quantiles. In contrast, we let the model decide, for each continuous predictor variable, the particular threshold that is critical for prediction.
Joint work with Claudia D'Ambrosio.
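An illustrative shape of a risk score with learned thresholds (assumed notation, not the exact formulation in the abstract above):

```latex
% With integer point values \alpha_j and per-variable cut-offs t_j,
\[
  \mathrm{score}(x) \;=\; \alpha_0 + \sum_{j=1}^{p} \alpha_j\,
  \mathbb{1}\{x_j \ge t_j\}, \qquad \alpha_j \in \mathbb{Z},
\]
% where a sparsity term limits the number of nonzero \alpha_j; choosing the
% thresholds t_j jointly with the point values \alpha_j, instead of fixing
% them a priori (e.g., at quantiles), is what leads to the mixed-integer
% nonlinear formulation.
```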