tswsxk / XKT

Licence: MIT license
Multiple Knowledge Tracing models implemented by mxnet

Programming Languages

Python
Makefile

Projects that are alternatives of or similar to XKT

Ncnn
ncnn is a high-performance neural network inference framework optimized for the mobile platform
Stars: ✭ 13,376 (+95442.86%)
Mutual labels:  mxnet
Mx Lsoftmax
mxnet version of Large-Margin Softmax Loss for Convolutional Neural Networks.
Stars: ✭ 175 (+1150%)
Mutual labels:  mxnet
Arcface Multiplex Recognition
A face recognition and identity authentication system for complex scenarios
Stars: ✭ 200 (+1328.57%)
Mutual labels:  mxnet
Gluon Ts
Probabilistic time series modeling in Python
Stars: ✭ 2,373 (+16850%)
Mutual labels:  mxnet
Crnn Mxnet Chinese Text Recognition
An implementation of CRNN (CNN+LSTM+warpCTC) on MxNet for chinese text recognition
Stars: ✭ 161 (+1050%)
Mutual labels:  mxnet
Xfer
Transfer Learning library for Deep Neural Networks.
Stars: ✭ 177 (+1164.29%)
Mutual labels:  mxnet
Nas
Neural architecture search (NAS)
Stars: ✭ 140 (+900%)
Mutual labels:  mxnet
Deeplearning Cfn
Distributed Deep Learning on AWS Using CloudFormation (CFN), MXNet and TensorFlow
Stars: ✭ 252 (+1700%)
Mutual labels:  mxnet
Coach
Reinforcement Learning Coach by Intel AI Lab enables easy experimentation with state of the art Reinforcement Learning algorithms
Stars: ✭ 2,085 (+14792.86%)
Mutual labels:  mxnet
Bmxnet V2
BMXNet 2: An Open-Source Binary Neural Network Implementation Based on MXNet
Stars: ✭ 199 (+1321.43%)
Mutual labels:  mxnet
Pyeco
python implementation of efficient convolution operators for tracking
Stars: ✭ 150 (+971.43%)
Mutual labels:  mxnet
Mobulaop
A Simple & Flexible Cross Framework Operators Toolkit
Stars: ✭ 161 (+1050%)
Mutual labels:  mxnet
Thinc
🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
Stars: ✭ 2,422 (+17200%)
Mutual labels:  mxnet
Djl
An Engine-Agnostic Deep Learning Framework in Java
Stars: ✭ 2,262 (+16057.14%)
Mutual labels:  mxnet
Netron
Visualizer for neural network, deep learning, and machine learning models
Stars: ✭ 17,193 (+122707.14%)
Mutual labels:  mxnet
Machine Learning Using K8s
Train and Deploy Machine Learning Models on Kubernetes using Amazon EKS
Stars: ✭ 145 (+935.71%)
Mutual labels:  mxnet
Imgclsmob
Sandbox for training deep learning networks
Stars: ✭ 2,405 (+17078.57%)
Mutual labels:  mxnet
digital champions deeplearning r mxnet
Showcase for using R + MXNET along with AWS and bitfusion for deep learning.
Stars: ✭ 20 (+42.86%)
Mutual labels:  mxnet
Fusenet
Deep fusion project of deeply-fused nets, and the study on the connection to ensembling
Stars: ✭ 230 (+1542.86%)
Mutual labels:  mxnet
Gluon Nlp
NLP made easy
Stars: ✭ 2,344 (+16642.86%)
Mutual labels:  mxnet

XKT

Multiple Knowledge Tracing models implemented by mxnet-gluon.

Readers who prefer pytorch can visit the sister projects:

where the former is easy to understand and the latter shares the same architecture as XKT.

For convenient dataset downloading and preprocessing for the knowledge tracing task, visit Edudata for a handy API.

Tutorial

Installation

  1. First, get the repo onto your computer via git or any other way you like.
  2. Suppose you put the project under your home directory; then you can either
    1. run pip install -e . to install the package, or
    2. export PYTHONPATH=$PYTHONPATH:~/XKT
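A typical install session following the steps above might look like this (the clone URL is inferred from the project name and may differ):

```shell
# clone the repository (URL assumed from the project name)
git clone https://github.com/tswsxk/XKT.git
cd XKT

# option 1: install as an editable package
pip install -e .

# option 2: just put the repo on PYTHONPATH
export PYTHONPATH=$PYTHONPATH:~/XKT
```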

Quick Start

To know how to use XKT, readers are encouraged to see

  • examples containing script usage and notebook demo and
  • scripts containing command-line interfaces which can be used to conduct hyper-parameter search.

Data Format

In XKT, all sequences are stored in JSON format, such as:

[[419, 1], [419, 1], [419, 1], [665, 0], [665, 0]]

Each item in the sequence represents one interaction. The first element of the item is the exercise id, and the second indicates whether the learner answered the exercise correctly: 0 for wrong, 1 for correct.
Each line is one JSON record, corresponding to one learner's interaction sequence.
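For illustration, such a file can be written and re-read with only the standard library (the file path below is hypothetical):

```python
import json
import os
import tempfile

# two learners' interaction sequences, one JSON record per line
records = [
    [[419, 1], [419, 1], [419, 1], [665, 0], [665, 0]],
    [[12, 0], [12, 1]],
]

path = os.path.join(tempfile.gettempdir(), "xkt_demo.json")  # hypothetical path
with open(path, "w") as f:
    for seq in records:
        f.write(json.dumps(seq) + "\n")

# read it back: one line, one learner
with open(path) as f:
    loaded = [json.loads(line) for line in f]

os.remove(path)
print(loaded[0])  # → [[419, 1], [419, 1], [419, 1], [665, 0], [665, 0]]
```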

A demo loading program is presented as follows:

import json
from tqdm import tqdm

def extract(data_src):
    """Read interaction sequences from a JSON-lines file and split
    long sequences into segments of at most `step` interactions."""
    responses = []
    step = 200
    with open(data_src) as f:
        for line in tqdm(f, "reading data from %s" % data_src):
            data = json.loads(line)
            # split the sequence into fixed-size segments
            for i in range(0, len(data), step):
                # skip segments with fewer than 2 interactions
                if len(data[i: i + step]) < 2:
                    continue
                responses.append(data[i: i + step])

    return responses

The above program can be found in XKT/utils/etl.py.

To deal with datasets stored in the tl format, where each record spans three lines (sequence length, comma-separated exercise ids, and comma-separated responses), e.g.:

5
419,419,419,665,665
1,1,1,0,0

Refer to Edudata Documentation.
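As a rough sketch (not Edudata's actual API; the function name tl2json is hypothetical), a tl-format file can be converted into the JSON format above like this:

```python
import json


def tl2json(src, dst):
    """Convert a tl-format file (three lines per record: sequence length,
    exercise ids, responses) into XKT's one-JSON-record-per-line format."""
    with open(src) as fin, open(dst, "w") as fout:
        while True:
            length_line = fin.readline()
            if not length_line.strip():
                break  # end of file
            exercises = fin.readline().strip().split(",")
            answers = fin.readline().strip().split(",")
            # pair each exercise id with its response
            seq = [[int(e), int(a)] for e, a in zip(exercises, answers)]
            fout.write(json.dumps(seq) + "\n")
```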

Citation

If this repository is helpful for you, please cite our work:

@inproceedings{tong2020structure,
  title={Structure-based Knowledge Tracing: An Influence Propagation View},
  author={Tong, Shiwei and Liu, Qi and Huang, Wei and Huang, Zhenya and Chen, Enhong and Liu, Chuanren and Ma, Haiping and Wang, Shijin},
  booktitle={2020 IEEE International Conference on Data Mining (ICDM)},
  pages={541--550},
  year={2020},
  organization={IEEE}
}

Appendix

Model

Many knowledge tracing models have been implemented in different frameworks; the following are the URLs of those implemented in Python (the starred ones are the authors' versions):

More models can be found here.

Dataset

Several datasets are suitable for this task; refer to the BaseData ktbd doc for them.