Abstract
Multijob production (MJP) is a class of flexible manufacturing systems that produce different products within the same production system. MJP is widely used in product assembly, and efficient MJP scheduling is crucial for productivity. Most existing MJP scheduling methods are inefficient for multijob serial lines with practical constraints. We propose a deep reinforcement learning (DRL)-driven scheduling framework for multijob serial lines that properly accounts for the practical constraints of identical machines, finite buffers, machine breakdowns, and delayed rewards. We analyze starvation and blockage times and derive a DRL-driven scheduling strategy that reduces blockage time and balances machine loads. We validate the proposed framework using real-world factory data collected over six months from a tier-one vendor of a world top-three automobile company. Our case study shows that the proposed scheduling framework improves average throughput by 24.2% compared with the conventional approach.
Citation
Lee, S., Kim, J., Wi, G., Won, Y., Eun, Y., & Park, K. J. (2024). Deep Reinforcement Learning-Driven Scheduling in Multijob Serial Lines: A Case Study in Automotive Parts Assembly. IEEE Transactions on Industrial Informatics, 20(2), 2932–2943. https://doi.org/10.1109/TII.2023.3292538