Optimization of laser energy fluence in pulsed laser deposition of ZnO on Al2O3(0001)

W. Yang, R. D. Vispute, S. Choopun, R. P. Sharma, H. Shen, T. Venkatesan

Research output: Contribution to journal › Conference article › peer-review


The effects of laser energy fluence on the growth of pulsed-laser-deposited ZnO thin films on c-plane sapphire substrates were systematically investigated using x-ray diffraction, Rutherford backscattering spectrometry with ion channeling, and scanning electron microscopy. Optical and electrical properties of the ZnO epilayers were characterized using ultraviolet-visible transmission spectroscopy and Van der Pauw measurements, respectively. The laser fluence was found to have a strong effect on the crystalline, optical, and electrical quality of the ZnO films. At low laser fluence, the ZnO film grows in a 3D-island mode with a low deposition rate, loss of Zn near the surface, and particulates on top of the film. High laser fluence can cause simultaneous multilayer growth and degradation of the crystalline, electrical, and optical quality of the ZnO films. The optimal laser fluence window for obtaining high-quality ZnO films for optoelectronic applications was found to be between 1.2 J/cm2 and 2.5 J/cm2. The dependence of the ZnO growth mode, surface morphology, and electrical and optical properties on laser fluence is discussed.

Original language: English (US)
Pages (from-to): P11251-P11256
Journal: Materials Research Society Symposium - Proceedings
State: Published - 2001
Externally published: Yes
Event: Growth, Evolution and Properties of Surfaces, Thin Films and Self-Organized Structures - Boston, MA, United States
Duration: Nov 27, 2000 - Dec 1, 2000

All Science Journal Classification (ASJC) codes

  • Materials Science (all)
  • Condensed Matter Physics
  • Mechanics of Materials
  • Mechanical Engineering

