
ProdocAI/EndConvo-health-1b-GGUF-v1

EndConvo-health-1b-GGUF-v1 is a fine-tuned version of the Llama3.2-1B model, trained on a dataset of healthcare-related conversations to identify whether a conversation has ended. By detecting closing statements, it helps avoid unnecessary responses from a larger language model.

Ollama Integration

The model is fully hosted on Ollama and ready to run: execute ollama run Prodoc/endconvo-health-1b to pull the model and start using it to detect conversation endpoints.
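The model can also be queried from Python. The snippet below is a minimal sketch that assumes the ollama Python client package is installed (pip install ollama) and that a local Ollama server has the model available; neither of these is documented by this card.

```python
# Minimal sketch: query the model through the Ollama Python client.
# Assumes `pip install ollama` and a running Ollama server that has the model
# pulled, e.g. via `ollama run Prodoc/endconvo-health-1b`.
import ollama

prompt = (
    "Below is the conversation between the bot and user:\n\n"
    "user: Thanks, that answers my question.\n"
    "bot: You're welcome! Let us know if you need anything else."
)

response = ollama.chat(
    model="Prodoc/endconvo-health-1b",
    messages=[{"role": "user", "content": prompt}],
)

# The model is trained to answer True (conversation ended) or False (still active).
print(response["message"]["content"])
```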

Model Details

  • Model Name: EndConvo-health-1b-GGUF-v1
  • Base Model: Llama3.2-1B
  • Number of Parameters: 1 Billion
  • Dataset: Custom dataset of 4,000 rows focused on healthcare conversations
  • Training Data Statistics:
    • Total Conversations: 11,798
    • Chat Count: 94,472
    • Average Chats per Conversation: ~8
    • Languages: Includes en, mr, te, hi, bn, among others (detailed in Language Map section)

Model Objective

The model identifies whether a healthcare-related conversation has reached a natural conclusion, preventing unnecessary responses from a larger LLM. It is trained to output one of two labels (a gating sketch follows the list below):

  • True: Conversation has ended.
  • False: Conversation is still active.
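A minimal gating sketch based on that output, assuming the classifier's raw text is parsed case-insensitively; the classify and respond callables are hypothetical hooks (for example, the Ollama call above and a call to the larger LLM), not APIs defined by this card:

```python
def conversation_has_ended(model_output: str) -> bool:
    """Interpret the classifier's raw text output ("True" / "False")."""
    return model_output.strip().lower().startswith("true")


def handle_turn(transcript, classify, respond):
    """Gate the expensive LLM call on the classifier's verdict.

    classify: hypothetical callable that sends `transcript` to
              EndConvo-health-1b and returns its raw text output.
    respond:  hypothetical callable that produces a reply with the larger LLM.
    """
    if conversation_has_ended(classify(transcript)):
        return None  # closing statement detected; skip the larger model
    return respond(transcript)
```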

Dataset Overview

This healthcare-focused conversational dataset includes 11,798 unique conversations, with an average of 8 messages per conversation. The dataset consists of conversations in a variety of languages with the following breakdown:

  • English (en): 78,404 messages
  • Marathi (mr): 2,092 messages
  • Hindi (hi): 2,857 messages
  • ... and others as per the Language Map section.

Example Input Format

Input to the model should be provided in the following format:

"Below is the conversation between the bot and user:

user: Please send me one bottle
bot: Hi, I am Vaidushi and how can I help you today regarding your interest to buy Madhavprash?
bot: Here is the link to purchase your Madhavprash https://madhavprash.store

user: 👆COD not possible here
bot: Currently, we do not support Cash on Delivery (COD) for purchases. You can complete your purchase using other available payment methods on our website.

bot: Thanks for your order, it will be delivered to you within 2-3 working days. Dosage Guidelines...
user: Thanks 🙏🤝 madam..... Kailas Varsekar ..."
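For illustration, a small helper that assembles this format from a list of (role, message) pairs; the function name and the tuple-based message structure are assumptions, while the header line and the user:/bot: prefixes come from the example above.

```python
def build_endconvo_prompt(messages):
    """Format a conversation for EndConvo-health-1b.

    `messages` is a list of (role, text) pairs, where role is "user" or "bot".
    Only the header line and the "user:"/"bot:" prefixes follow the card's
    example; the helper itself is illustrative.
    """
    lines = ["Below is the conversation between the bot and user:", ""]
    lines += [f"{role}: {text}" for role, text in messages]
    return "\n".join(lines)


prompt = build_endconvo_prompt([
    ("user", "Please send me one bottle"),
    ("bot", "Here is the link to purchase your Madhavprash https://madhavprash.store"),
    ("user", "Thanks 🙏🤝 madam....."),
])
print(prompt)
```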
