HelpingAI-9B-mlx

This model was converted to MLX format from OEvortex/HelpingAI-9B. Refer to the original model card for more details on the model.

Overview

HelpingAI-9B is an emotionally intelligent conversational AI designed to engage users with empathy and understanding. It aims to provide supportive, empathetic, and psychologically-grounded responses across a wide range of topics and contexts. This model is ideal for applications requiring emotionally aware and supportive dialogue.

Objectives

  • Engage in open-ended dialogue while displaying emotional intelligence
  • Recognize and validate user emotions and emotional contexts
  • Provide supportive, empathetic, and psychologically-grounded responses
  • Avoid insensitive, harmful, or unethical speech
  • Continuously improve emotional awareness and dialogue skills

Methodology

HelpingAI-9B is based on the HelpingAI series and further trained using:

  • Supervised learning on large dialogue datasets with emotional labeling
  • Reinforcement learning with a reward model favoring emotionally supportive responses
  • Constitutional AI-style training to instill stable and beneficial objectives
  • Knowledge augmentation from psychological resources on emotional intelligence
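The reward-model step above can be illustrated with a pairwise preference loss: given two candidate replies where one was judged more emotionally supportive, the reward model is trained so the supportive reply scores higher. A minimal sketch (the function name and scores here are illustrative, not the actual training setup):

```python
import math

def pairwise_preference_loss(score_preferred: float, score_rejected: float) -> float:
    """Bradley-Terry style loss: small when the preferred (more supportive)
    response outscores the rejected one, large otherwise."""
    # Equivalent to -log(sigmoid(score_preferred - score_rejected))
    margin = score_preferred - score_rejected
    return math.log(1.0 + math.exp(-margin))

# When the reward model already ranks the supportive reply higher,
# the loss is small; when it ranks it lower, the loss is large.
low_loss = pairwise_preference_loss(2.0, -1.0)   # supportive reply scores higher
high_loss = pairwise_preference_loss(-1.0, 2.0)  # supportive reply scores lower
assert low_loss < high_loss
```

Minimizing this loss over many labeled response pairs pushes the reward model to assign higher scores to emotionally supportive responses, which the RL stage then optimizes against.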

Emotional Quotient (EQ)

HelpingAI-9B-mlx reports an Emotional Quotient (EQ) score of 89.23, placing it among the highest-scoring models on this metric. The score is intended to reflect its ability to recognize and respond to human emotions in a supportive and empathetic manner.

Usage Code

Install mlx-lm

```shell
pip install mlx-lm
```

Load and Use the Model

```python
from mlx_lm import load, generate

# Download (if needed) and load the MLX-converted weights and tokenizer.
model, tokenizer = load("OEvortex/HelpingAI-9B-mlx")

# Generate a completion; verbose=True streams tokens as they are produced.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
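For conversational use, the prompt is typically rendered from a chat template rather than passed as raw text. A minimal sketch assuming a ChatML-style template (the template actually shipped with this tokenizer may differ; in practice you would prefer `tokenizer.apply_chat_template` when available):

```python
def build_chat_prompt(messages):
    """Render a list of {"role", "content"} messages into a ChatML-style
    prompt string (assumed format; check the tokenizer's own template)."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "\n".join(parts)

prompt = build_chat_prompt([
    {"role": "system", "content": "You are an empathetic assistant."},
    {"role": "user", "content": "I'm feeling down today."},
])
```

The resulting string can then be passed as the `prompt` argument to `generate` above.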

Example Dialogue

Express joy and excitement about visiting a new place.

Yay! I'm super excited to visit this new place! 🎉 It sounds amazing and I can't wait to explore and have fun there! This is going to be such an exciting adventure! 🌟 Let's go, let's do this! ✨

Express feeling down and seeking support.

Oh no, I'm sorry to hear that you're feeling down. It happens sometimes, but remember that it's okay to feel sad. If you want to talk about what's bothering you, I'm here to listen and offer support. We can find a way through this together! 🌈