Machine Learning Driven Channel Thickness Optimization in Dual-Layer Oxide Thin-Film Transistors for Advanced Electrical Performance

Abstract

Machine learning (ML) offers time savings and performance improvements in practical electronic device design through adaptive learning. Herein, Bayesian optimization (BO) is successfully applied to the design of optimal dual-layer oxide semiconductor thin-film transistors (OS TFTs). This approach effectively handles the complex correlation and interdependency between the two oxide semiconductor layers, enabling an efficient design of experiments (DoE) and reducing trial-and-error. By considering field-effect mobility (μFE) and threshold voltage (Vth) simultaneously, the dual-layer structure designed by the BO model produces OS TFTs with remarkable electrical performance while greatly reducing the number of experimental trials (only 15 data sets are required). The optimized dual-layer OS TFTs achieve an enhanced field-effect mobility of 36.1 cm² V⁻¹ s⁻¹ and show good stability under bias stress, with a negligible shift in threshold voltage compared to conventional IGZO TFTs. Moreover, the BO algorithm can be customized to individual preferences by applying weight factors to both field-effect mobility (μFE) and threshold voltage (Vth).
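The optimization loop described in the abstract (a BO model that proposes two channel-layer thicknesses and scores each device by a weighted combination of μFE and Vth) can be sketched as follows. This is a minimal illustration, not the authors' code: the scikit-optimize library, the thickness bounds, the weight factors, and the toy `measure_tft` surrogate are all assumptions; in the actual study each objective evaluation would correspond to fabricating and measuring a TFT.

```python
# Minimal sketch of weighted Bayesian optimization over dual-layer channel
# thicknesses. Assumptions: scikit-optimize (skopt), 5-50 nm bounds, and a
# synthetic measure_tft() stand-in; none of these come from the paper.
import math

from skopt import gp_minimize
from skopt.space import Real

W_MU, W_VTH = 0.7, 0.3  # hypothetical preference weights for mu_FE vs. Vth
VTH_TARGET = 0.0        # assumed target threshold voltage (V)

def measure_tft(t_front_nm, t_back_nm):
    """Toy analytic stand-in (not real device physics) so the sketch runs.
    In practice this step is a fabricated-device measurement: extract mu_FE
    and Vth from the transfer curve of a TFT with the given thicknesses."""
    mu_fe = 36.0 * math.exp(-((t_front_nm - 20.0) ** 2
                              + (t_back_nm - 10.0) ** 2) / 400.0)
    vth = 0.1 * (t_back_nm - 10.0)
    return mu_fe, vth

def objective(x):
    """Scalarize the two targets: reward mobility, penalize Vth deviation.
    gp_minimize minimizes, so the weighted score is negated."""
    mu_fe, vth = measure_tft(*x)
    return -(W_MU * mu_fe - W_VTH * abs(vth - VTH_TARGET))

space = [Real(5.0, 50.0, name="t_front_nm"),  # assumed thickness ranges (nm)
         Real(5.0, 50.0, name="t_back_nm")]

result = gp_minimize(objective, space,
                     n_calls=15,          # mirrors the 15 data sets in the abstract
                     n_initial_points=5,  # random seeding before the GP model
                     random_state=0)
print("proposed thicknesses (nm):", result.x)
print("weighted score:", -result.fun)
```

Changing W_MU and W_VTH reproduces the preference weighting mentioned in the last sentence of the abstract: a larger W_MU steers the search toward high mobility, while a larger W_VTH favors a tightly controlled threshold voltage.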

Citation (APA)

Lee, J., Lee, J. H., Lee, C., Lee, H., Jin, M., Kim, J., … Kim, Y. S. (2023). Machine Learning Driven Channel Thickness Optimization in Dual-Layer Oxide Thin-Film Transistors for Advanced Electrical Performance. Advanced Science, 10(36). https://doi.org/10.1002/advs.202303589
