Enhancing E-commerce Chatbots with Falcon-7B and 16-bit Full Quantization

  • Luo Y
  • Wei Z
  • Xu G
  • et al.
Citations: N/A
Readers: 9 (Mendeley users with this article in their library)

Abstract

E-commerce chatbots play a crucial role in customer service but often struggle to understand complex queries. This study introduces an approach leveraging the Falcon-7B model, a state-of-the-art Large Language Model (LLM) with 7 billion parameters. Trained on a vast dataset of 1,500 billion tokens from RefinedWeb and curated corpora, the Falcon-7B model excels at natural language understanding and generation. Notably, its fully 16-bit quantized transformer enables efficient computation without compromising scalability or performance. By harnessing cutting-edge machine learning techniques, our method aims to redefine e-commerce chatbot systems, providing businesses with a robust solution for delivering personalized customer experiences.
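The abstract's central efficiency claim is that storing the model's weights in 16-bit (half-precision) format halves memory relative to 32-bit floats, at the cost of a small rounding error per weight. The paper's actual serving pipeline is not shown here; the following is a minimal, self-contained sketch of that trade-off using Python's standard-library `struct` module, whose `"e"` format code packs IEEE 754 half-precision values. The function name `quantize_fp16` is illustrative, not from the paper.

```python
import struct

def quantize_fp16(weights):
    """Round-trip Python floats through IEEE 754 half precision.

    Each value is stored in 2 bytes (vs. 4 for float32), which is the
    memory saving that 16-bit full quantization exploits. Returns the
    dequantized values and the packed size in bytes.
    """
    packed = struct.pack(f"{len(weights)}e", *weights)  # 2 bytes per value
    restored = list(struct.unpack(f"{len(weights)}e", packed))
    return restored, len(packed)

weights = [0.1234567, -1.5, 3.14159]
restored, nbytes = quantize_fp16(weights)
print(nbytes)    # 6 bytes total; float32 storage would need 12
print(restored)  # values close to the originals, with fp16 rounding
```

Values exactly representable in half precision (such as -1.5) survive the round trip unchanged, while others pick up an error bounded by the 10-bit fp16 mantissa; at 7 billion parameters, this halving of storage is what keeps the model deployable without, per the abstract, compromising performance.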

Citation (APA)

Luo, Y., Wei, Z., Xu, G., Li, Z., Xie, Y., & Yin, Y. (2024). Enhancing E-commerce Chatbots with Falcon-7B and 16-bit Full Quantization. Journal of Theory and Practice of Engineering Science, 4(02), 52–57. https://doi.org/10.53469/jtpes.2024.04(02).08
