
PromoGen: Advertisement Generation Website using LLM

Rushikesh Bagul, Bhavin Waghela, Ashwin Mahawar, Biswajit Panda, Pravin Shinde

Abstract


PromoGen is a project that harnesses large language models (LLMs) to automate the generation of concise, impactful ad scripts. With the growing demand for scalable content creation in marketing, PromoGen addresses the need for quick, high-quality promotional content tailored to specific audiences and platforms. Using transformer-based architectures such as GPT, the system generates creative ad copy that aligns with marketing objectives such as tone, style, and length. The core innovation of PromoGen lies in fine-tuning LLMs with techniques such as Low-Rank Adaptation (LoRA), which enhances the model's ability to produce domain-specific content at minimal computational cost. The system is designed to create ad scripts for a variety of platforms, including social media, television, and print media, making it versatile across marketing contexts. PromoGen's performance is evaluated through a combination of human evaluation, coherence metrics, and user-engagement analysis, ensuring that the generated scripts meet both creative and practical standards. This project contributes to the field of generative AI by extending the application of LLMs to advertising, a domain traditionally reliant on human creativity. Furthermore, PromoGen explores ethical considerations such as bias mitigation and transparency in AI-generated content.

Keywords: PromoGen, large language models (LLMs), advertisement generation, GPT, fine-tuning, Low-Rank Adaptation (LoRA), marketing automation, creative content generation, transformer models, AI in advertising
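The abstract credits LoRA with producing domain-specific content at low computational cost. The core idea can be sketched in plain Python: rather than updating a full d_out x d_in weight matrix W during fine-tuning, LoRA freezes W and trains two small matrices B (d_out x r) and A (r x d_in) of low rank r, so the adapted layer computes (W + BA)x. All dimensions and values below are illustrative, not PromoGen's actual configuration.

```python
def lora_param_counts(d_out: int, d_in: int, r: int) -> tuple[int, int]:
    """Trainable parameters for one weight matrix:
    full fine-tuning (d_out * d_in) vs. LoRA (r * (d_out + d_in))."""
    return d_out * d_in, r * (d_out + d_in)


def matmul(X, Y):
    """Plain-Python matrix multiply for lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]


def lora_forward(W, A, B, x):
    """Compute (W + B A) x as W x + B (A x) -- the LoRA forward pass.
    W stays frozen; only A and B would be updated during training."""
    col = [[v] for v in x]                    # column vector
    base = matmul(W, col)                     # frozen pretrained path
    delta = matmul(B, matmul(A, col))         # low-rank adapter path
    return [b[0] + d[0] for b, d in zip(base, delta)]


# Parameter savings for a hypothetical 4096 x 4096 attention projection
# with rank r = 8: 16,777,216 full parameters vs. 65,536 LoRA parameters.
full, lora = lora_param_counts(4096, 4096, 8)
```

With rank 8 the adapter trains roughly 0.4% of the parameters of the full matrix, which is the "minimal computational cost" property the abstract refers to.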



