dc.contributor.author | Satic, Ugur | |
dc.contributor.author | Jacko, P. | |
dc.contributor.author | Kirkbride, C. | |
dc.date.accessioned | 2024-08-29T08:57:38Z | |
dc.date.available | 2024-08-29T08:57:38Z | |
dc.date.issued | 2024 | en_US |
dc.identifier.issn | 0377-2217 | |
dc.identifier.uri | https://doi.org/10.1016/j.ejor.2023.10.046 | |
dc.identifier.uri | https://hdl.handle.net/20.500.12573/2359 | |
dc.description.abstract | We consider the dynamic and stochastic resource-constrained multi-project scheduling problem, which allows for the random arrival of projects and stochastic task durations. Completing projects generates rewards, which are reduced by a tardiness cost in the case of late completion. Multiple types of resource are available, and projects consume different amounts of these resources while being processed. The problem is modelled as an infinite-horizon discrete-time Markov decision process, and the objective is to maximise the expected discounted long-run profit. We use an approximate dynamic programming (ADP) algorithm with a linear approximation model suitable for online decision making. Our approximation model uses project elements that are easily accessible to a decision-maker, with the model coefficients obtained offline via a combination of Monte Carlo simulation and least squares estimation. Our numerical study shows that ADP often statistically significantly outperforms the optimal reactive baseline algorithm (ORBA). In experiments on smaller problems, however, both typically perform suboptimally compared to the optimal scheduler obtained by stochastic dynamic programming. ADP has an advantage over ORBA and dynamic programming in that it can be applied to larger problems. We also show that ADP generally produces statistically significantly higher profits than common algorithms used in practice, such as a rule-based algorithm and a reactive genetic algorithm. | en_US |
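The offline step described in the abstract, estimating the coefficients of a linear value-function approximation from Monte Carlo simulation output via least squares, can be illustrated with a minimal sketch. The feature map, the surrogate simulator, and all numbers below are hypothetical placeholders and not the paper's implementation.

# Illustrative sketch only: Monte Carlo samples of (state features, discounted
# profit) are generated by a stand-in simulator, and linear-approximation
# coefficients are fitted by least squares. All names and values are assumed.
import numpy as np

rng = np.random.default_rng(42)

def features(state):
    # Hypothetical basis: a constant term plus project elements a decision-maker
    # can observe directly (remaining work, slack to due date, free capacity).
    return np.array([1.0, state[0], state[1], state[2]])

def simulate_discounted_profit(state):
    # Stand-in for simulating the scheduling MDP from this state and summing
    # discounted rewards minus tardiness costs; here just a noisy proxy.
    return 10.0 * state[0] - 2.0 * max(-state[1], 0.0) + 1.5 * state[2] + rng.normal(scale=5.0)

# Monte Carlo sampling of states and their simulated discounted profits.
states = rng.uniform(0.0, 10.0, size=(1000, 3))
X = np.vstack([features(s) for s in states])
y = np.array([simulate_discounted_profit(s) for s in states])

# Least-squares estimate of the linear value-function coefficients.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", coef)

# Online, such a fitted linear model would score candidate scheduling decisions
# by approximating the expected discounted future profit of each resulting state.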
dc.language.iso | eng | en_US |
dc.publisher | Elsevier B.V. | en_US |
dc.relation.isversionof | 10.1016/j.ejor.2023.10.046 | en_US |
dc.rights | info:eu-repo/semantics/openAccess | en_US |
dc.subject | Project scheduling | en_US |
dc.subject | Markov decision processes | en_US |
dc.subject | Approximate dynamic programming | en_US |
dc.subject | Dynamic resource allocation | en_US |
dc.subject | Dynamic programming | en_US |
dc.title | A simulation-based approximate dynamic programming approach to dynamic and stochastic resource-constrained multi-project scheduling problem | en_US |
dc.type | article | en_US |
dc.contributor.department | AGÜ, Faculty of Engineering, Department of Industrial Engineering | en_US |
dc.contributor.authorID | 0000-0002-9160-0006 | en_US |
dc.contributor.institutionauthor | Satic, Ugur | |
dc.identifier.volume | 315 | en_US |
dc.identifier.issue | 2 | en_US |
dc.identifier.startpage | 454 | en_US |
dc.identifier.endpage | 469 | en_US |
dc.relation.journal | European Journal of Operational Research | en_US |
dc.relation.publicationcategory | Article - International Refereed Journal - Institutional Academic Staff | en_US |