
Luminance Transformer

This repository contains code for the Luminance (Lumi) Transformer for electroluminescence (EL) images. The model is based on the transformer architecture and is designed to process EL images of solar cells.

Background

EL imaging is a technique used to study solar cells. It involves capturing images of solar cells using a camera sensitive to the near-infrared region of the electromagnetic spectrum. These images show the distribution of charge carriers in the solar cell, which is related to the efficiency of the cell.

The Lumi Transformer model is designed to process EL images and predict the efficiency of the solar cell. It is based on the transformer architecture, which has been shown to be effective for processing sequential data such as natural language text.
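
The repository's exact API is not reproduced here, so the following is only a minimal sketch of a ViT-style pipeline in PyTorch, assuming 300x300 grayscale EL cell images: the image is split into patches, embedded, passed through a transformer encoder, and classified from a [CLS] token. The class name `ELTransformer` and all hyperparameters are illustrative, not the actual Lumi-T configuration.

```python
# Hypothetical sketch of a ViT-style classifier for 300x300 grayscale EL cell images.
# Names and hyperparameters are illustrative, not the actual Lumi-T configuration.
import torch
import torch.nn as nn

class ELTransformer(nn.Module):
    def __init__(self, img_size=300, patch_size=30, dim=256, depth=6, heads=8, num_classes=2):
        super().__init__()
        num_patches = (img_size // patch_size) ** 2
        # Split the EL image into non-overlapping patches and embed each one.
        self.patch_embed = nn.Conv2d(1, dim, kernel_size=patch_size, stride=patch_size)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches + 1, dim))
        encoder_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=depth)
        self.head = nn.Linear(dim, num_classes)  # defective vs. functional

    def forward(self, x):                                      # x: (B, 1, 300, 300)
        x = self.patch_embed(x).flatten(2).transpose(1, 2)     # (B, num_patches, dim)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        x = torch.cat([cls, x], dim=1) + self.pos_embed
        x = self.encoder(x)
        return self.head(x[:, 0])                              # classify from the [CLS] token

model = ELTransformer()
logits = model(torch.randn(4, 1, 300, 300))                    # -> shape (4, 2)
```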

Results

The Lumi Transformer model achieves state-of-the-art performance on the ELPV dataset, reaching an accuracy of 91.7% on the binary classification task (defective vs. functional).

ELPV dataset: https://paperswithcode.com/dataset/elpv
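
For illustration only, below is a sketch of how the ELPV data could be read and binarized into defective/functional labels, assuming the dataset's whitespace-separated labels.csv layout (image path, defect probability, cell type). The 0.5 threshold and function name `load_elpv` are assumptions, not necessarily the preprocessing used for Lumi-T.

```python
# Illustrative ELPV loader; assumes labels.csv rows of the form:
#   images/cell0001.png  0.0  mono
import os
import numpy as np
from PIL import Image

def load_elpv(labels_path="labels.csv", root="."):
    paths, probas, types = [], [], []
    with open(labels_path) as f:
        for line in f:
            if not line.strip():
                continue
            path, proba, cell_type = line.split()
            paths.append(path)
            probas.append(float(proba))
            types.append(cell_type)
    images = np.stack([np.asarray(Image.open(os.path.join(root, p))) for p in paths])
    labels = (np.array(probas) >= 0.5).astype(int)   # 1 = defective, 0 = functional (assumed threshold)
    return images, labels, np.array(types)

# Example: select only monocrystalline cells for the ELPV-Monocrystalline experiments.
# images, labels, types = load_elpv()
# mono_mask = types == "mono"
```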

ELPV-Monocrystalline

| Model     | Recall | Precision | F1-Score |
|-----------|--------|-----------|----------|
| Lumi-T    | 0.9191 | 0.9339    | 0.9256   |
| VGG-19    | 0.8529 | 0.8603    | 0.8492   |
| ResNet-50 | 0.8824 | 0.8855    | 0.8806   |

ELPV-Polycrystalline

| Model     | Recall | Precision | F1-Score |
|-----------|--------|-----------|----------|
| Lumi-T    | 0.9116 | 0.9509    | 0.9289   |
| VGG-19    | 0.8462 | 0.8729    | 0.8462   |
| ResNet-50 | 0.8269 | 0.8601    | 0.7951   |

ELPV-Overall

| Model     | Recall | Precision | F1-Score |
|-----------|--------|-----------|----------|
| Lumi-T    | 0.8851 | 0.9278    | 0.9170   |
| VGG-19    | 0.8552 | 0.8552    | 0.7885   |
| ResNet-50 | 0.8049 | 0.8476    | 0.8049   |

ELPV-Transfer Learning (Monocrystalline to Polycrystalline)

| Model     | F1-Score |
|-----------|----------|
| Lumi-T    | 0.8202   |
| ResNet-50 | 0.6103   |
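
For reference, the per-class metrics in the tables above could be reproduced from binary predictions (1 = defective) with scikit-learn; the labels below are illustrative only, not the actual test split.

```python
# Illustrative metric computation; y_true and y_pred are placeholder labels.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # ground-truth labels (1 = defective)
y_pred = [1, 0, 1, 0, 0, 0, 1, 1]   # model predictions

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("f1-score :", f1_score(y_true, y_pred))
```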

Acknowledgments

This work was supported by the Katana HPC facility at the University of New South Wales. We would also like to thank GreenDynamics Pty Ltd for their support.
