_version_ 1852016801658961920
author Ailian Gao (20629841)
author2 Zenglei Liu (20629838)
author2_role author
author_facet Ailian Gao (20629841)
Zenglei Liu (20629838)
author_role author
dc.creator.none.fl_str_mv Ailian Gao (20629841)
Zenglei Liu (20629838)
dc.date.none.fl_str_mv 2025-09-09T17:32:47Z
dc.identifier.none.fl_str_mv 10.1371/journal.pone.0330433.g009
dc.relation.none.fl_str_mv https://figshare.com/articles/figure/The_comparison_results_between_the_LSTKT_models_with_two_different_architectures_/30088480
dc.rights.none.fl_str_mv CC BY 4.0
info:eu-repo/semantics/openAccess
dc.subject.none.fl_str_mv Cancer
Science Policy
Biological Sciences not elsewhere classified
students'
integrate temporal information
conducted comparison experiments
bidirectional lstm model
series forecasting pipeline
machine learning algorithms
proposed lstkt model
proposed informer model
publicly available dataset
individual knowledge states
informer
achieved promising outcomes
short sequence prediction
probability sparse self-attention
implement knowledge tracing
long sequence time-series
sparse self-attention
series prediction
knowledge tracing
sequence time
tracing studies
time stamps
time stamp
ednet dataset
assistments2017 dataset
assistments2009 dataset
current knowledge
target exercises
previous approaches
learning performance
extensively utilized
existing models
exercising recordings
decoder architecture
canonical encoder
attention module
attention mechanism
answering records
82%
81%
dc.title.none.fl_str_mv The comparison results between the LSTKT models with two different architectures.
dc.type.none.fl_str_mv Image
Figure
info:eu-repo/semantics/publishedVersion
image
description The comparison results between the LSTKT models with two different architectures.
eu_rights_str_mv openAccess
id Manara_ea73d4e3d6f9c60fc22bc76fccf00e32
identifier_str_mv 10.1371/journal.pone.0330433.g009
network_acronym_str Manara
network_name_str ManaraRepo
oai_identifier_str oai:figshare.com:article/30088480
publishDate 2025
rights_invalid_str_mv CC BY 4.0
status_str publishedVersion
title The comparison results between the LSTKT models with two different architectures.