
Blocksmith

Model description

Blocksmith is a natural language processing model that generates concise summaries of Minecraft logs. It is based on the Transformer architecture, specifically T5-small, and was fine-tuned on a dataset of Minecraft logs together with the XSum summarization dataset.
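
A minimal usage sketch, assuming the checkpoint loads with the standard transformers summarization pipeline; the sample log lines below are illustrative, not taken from the training data:

```python
# Minimal usage sketch (assumes `transformers` and `torch` are installed).
from transformers import pipeline

summarizer = pipeline("summarization", model="DesilDev/Blocksmith")

# Illustrative server-log excerpt in the standard bracketed log format.
log_excerpt = (
    "[12:01:03] [Server thread/INFO]: Steve joined the game "
    "[12:04:17] [Server thread/INFO]: Steve fell from a high place "
    "[12:04:20] [Server thread/INFO]: Steve left the game"
)
print(summarizer(log_excerpt, max_length=32, min_length=5)[0]["summary_text"])
```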

Intended uses & limitations

Blocksmith is intended for analyzing player behavior, identifying potential issues or bugs, and generating insights for game improvement. However, it may struggle with log formats or game versions that are not represented in the training data, and its summaries may be biased toward that data's content.

Training procedure

The T5-small model was fine-tuned on the Minecraft log dataset and the XSum text summarization dataset using the Adam optimizer with a learning rate of 2e-05 for 1 epoch. Early stopping was not used.

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
  • mixed_precision_training: Native AMP
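
A configuration sketch that mirrors the settings above using the Seq2SeqTrainer API; the one-row dataset is a placeholder for the actual Minecraft log and XSum training pairs, not the real preprocessing:

```python
# Sketch only: mirrors the hyperparameters listed above. The one-row
# dataset is a stand-in for the real Minecraft log + XSum pairs.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-small")

raw = Dataset.from_dict({
    "document": ["[12:01] Steve joined the game [12:05] Steve left the game"],
    "summary": ["Steve briefly joined the server."],
})

def preprocess(batch):
    # Tokenize inputs and target summaries; lengths are assumptions.
    inputs = tokenizer(batch["document"], truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["summary"], truncation=True, max_length=64)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="blocksmith",
    learning_rate=2e-5,               # Adam betas/epsilon left at the listed defaults
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    fp16=True,                        # Native AMP; requires a GPU, drop on CPU
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,           # placeholder; use a real held-out split
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```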

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 11   | 2.8271          | 34.8098 | 17.0245 | 32.5651 | 32.2774   | 14.8182 |
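
The ROUGE scores above can be computed with the evaluate library; a sketch, where the prediction/reference pair is a placeholder rather than actual model output:

```python
import evaluate

rouge = evaluate.load("rouge")  # requires the `rouge_score` package

# Placeholder pair; in practice, predictions come from model generations
# and references from the held-out summaries.
scores = rouge.compute(
    predictions=["the player joined and then left the server"],
    references=["Steve joined the game and left shortly after"],
)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```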

Framework versions

  • Transformers 4.42.4
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1

Safetensors

  • Model size: 60.5M params
  • Tensor type: F32

Model tree for DesilDev/Blocksmith

  • Base model: google-t5/t5-small
